How To Circumvent 504 Errors
I am working in ReactJS, and one of the main aspects of our project is the ability to upload a scorecard and have all of its results parsed and placed into objects. However, these PDFs contain a LOT of information, averaging 12-14 pages. Most of it is irrelevant; I usually only need pages 5-7, but users will be users, and they upload all 12. I am using the PDFTables API, which is very good, and we're not looking for replacements on that. However, because the file is so large, if I am somewhere with only a half-decent connection, the process takes so long that I am hit with a 504 (Gateway Timeout) error. With a good to great connection, there's no issue. That said, I have two questions:
- Is there a way to extend the amount of time that must elapse before my computer gives up on the process?
- Is there a way to parse only SOME of the pages that get submitted?
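On the first question, one partial lever on the client side: the `request` package used in the code below accepts a `timeout` option in milliseconds, so the upload call can be told to wait longer before aborting. Note that a 504 is produced by the server's gateway, so this only helps when the client itself is the one giving up. A minimal sketch (`withTimeout` is a hypothetical helper, not part of the original code):

```javascript
// Hypothetical helper: merge a generous timeout (in milliseconds) into an
// options object. `timeout` is a real option of the `request` package.
// Usage sketch: request.post(withTimeout({ encoding: null, url: url }), callback);
function withTimeout(options, ms = 120000) {
  return { ...options, timeout: ms };
}
```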
The relevant code is shown below:
```javascript
// Assumed dependencies: `request`, `node-xlsx` (as `xlsx`), plus Node built-ins.
const request = require('request');
const fs = require('fs');
const path = require('path');
const xlsx = require('node-xlsx');

const url = 'https://pdftables.com/api?key=770oukvvx1wl&format=xlsx-single';

const pdfToExcel = (pdfFile) => {
  const req = request.post({ encoding: null, url: url }, function (err, resp, body) {
    if (!err && resp.statusCode === 200) {
      fs.writeFile(`${pdfFile.path}.xlsx`, body, function (err) {
        if (err) {
          console.log('error writing file');
        }
      });
    } else {
      console.log('error retrieving URL');
    }
  });
  const form = req.form();
  form.append('file', fs.createReadStream(`./${pdfFile.path}`));
};

const parseExcel = async (file) => {
  let workSheetsFromFile;
  // `search` takes a regex, so `search(".xlsx")` matched any character before
  // "xlsx"; `endsWith` checks the extension literally.
  if (!file.path.endsWith('.xlsx')) {
    // The upload was a PDF: parse the converted .xlsx, then delete both files.
    const filePath = path.resolve(`./${file.path}.xlsx`);
    workSheetsFromFile = xlsx.parse(filePath);
    fs.unlinkSync(`./${file.path}`);
    fs.unlinkSync(filePath);
    return workSheetsFromFile[0].data;
  }
  // The upload was already an .xlsx file: parse it, then delete it.
  const filePath = path.resolve(`./${file.path}`);
  workSheetsFromFile = xlsx.parse(filePath);
  fs.unlinkSync(filePath);
  return workSheetsFromFile[0].data;
};
```
Solution 1:[1]
Yes there is a way, but that doesn't mean you should do it this way.
You could, for example, save your values as a JSON string and store that string in the column. If you later want to add a value, you simply parse the JSON, add the value, and write it back to the database. (This might also work with a BLOB, but I'm not sure.)
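A minimal sketch of that parse-and-re-stringify round trip (`addName` is a hypothetical helper; the column is assumed to hold a JSON array as text):

```javascript
// The list lives in a single text column, so adding a value means
// parse -> push -> re-stringify, then write the string back to the column.
function addName(columnValue, name) {
  const names = JSON.parse(columnValue || '[]'); // column may start out empty
  names.push(name);
  return JSON.stringify(names);
}
```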
However, I would not recommend storing a list inside a column, as SQL is not meant to be used like that.
What I would recommend is a table with a row for every grade, each with its own primary key. Like this:
| ID | Grade |
|---|---|
| 1 | Elementary |
| 2 | Guidance |
| 3 | High school |
And then another table containing all the names, with its own primary key and the grade's ID as a foreign key. E.g.:
| ID | GradeID | Name |
|---|---|---|
| 1 | 1 | Kai |
| 2 | 1 | Matthew |
| 3 | 1 | Grace |
| 4 | 2 | Eli |
| 5 | 2 | Zoey |
| 6 | 2 | David |
| 7 | 2 | Nora |
| 8 | 2 | William |
| 9 | 3 | Emma |
| 10 | 3 | James |
| 11 | 3 | Levia |
| 12 | 3 | Sophia |
If you want to know more about this, you should read about Normalization in SQL.
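With that layout, reading all names in a grade is a join on `GradeID`. Sketched here in JavaScript, with plain arrays standing in for the two tables above (the equivalent SQL is in the comment):

```javascript
// Equivalent SQL:
//   SELECT n.Name FROM Names n JOIN Grades g ON n.GradeID = g.ID
//   WHERE g.Grade = 'Elementary';
const grades = [
  { id: 1, grade: 'Elementary' },
  { id: 2, grade: 'Guidance' },
  { id: 3, grade: 'High school' },
];
const names = [
  { id: 1, gradeId: 1, name: 'Kai' },
  { id: 2, gradeId: 1, name: 'Matthew' },
  { id: 3, gradeId: 1, name: 'Grace' },
  { id: 4, gradeId: 2, name: 'Eli' },
];

// Filter-and-map over the arrays plays the role of the JOIN.
function namesForGrade(gradeName) {
  const grade = grades.find((g) => g.grade === gradeName);
  return grade ? names.filter((n) => n.gradeId === grade.id).map((n) => n.name) : [];
}
```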
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | jstnklnr |
