How to loop an array of links and get json response in Node.js
I'm accessing a URL that returns JSON (https://collections.louvre.fr/ark:/53355/cl010270063.json).
The problem is that I have an array of 448 thousand links to fetch, and requesting them one by one would take weeks.
I've tried this:
for (let i = 0; i < artLinks.length; i += chunkSize) {
    const chunk = artLinks.slice(i, i + chunkSize);
    let resArray = []
    // Collect the responses into resArray
    Promise.all(chunk.map((endpoint) => axios.get(`https://collections.louvre.fr/en/ark:/${endpoint}.json`))).then((res) =>
        resArray.push(res.data));
    const filePath = path.join(__dirname, `../../json/artInfo-${i}.json`);
    await fs.writeFileSync(filePath, JSON.stringify(resArray))
    bar.tick()
}
The idea was to write a file with the collected information every 100 responses or so. But this creates an empty file: the loop fires all the requests at once instead of waiting for each chunk's responses before moving on to the next iteration.
How can I loop over an enormous array of links and collect its responses in a sensible way?
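One common pattern is to `await` the `Promise.all` for each chunk inside the loop, so only one chunk of requests is in flight at a time. Below is a minimal sketch of that pattern; `fetchJson` is a hypothetical stand-in for the real `axios.get` call against the Louvre URLs, and the chunk size of 100 is just an example, so no network access is assumed here.

```javascript
// Split an array into consecutive chunks of the given size.
function chunked(array, size) {
  const chunks = [];
  for (let i = 0; i < array.length; i += size) {
    chunks.push(array.slice(i, i + size));
  }
  return chunks;
}

// Process all links chunk by chunk, awaiting each chunk's responses
// before starting the next one, so at most `chunkSize` requests are
// in flight at any moment.
async function processAll(links, fetchJson, chunkSize = 100) {
  const results = [];
  for (const [index, chunk] of chunked(links, chunkSize).entries()) {
    // Promise.all resolves to an array of all responses in this chunk.
    const data = await Promise.all(chunk.map((link) => fetchJson(link)));
    // Here you could write `data` to a per-chunk file with
    // fs.writeFileSync before continuing to the next chunk.
    results.push({ index, data });
  }
  return results;
}

// Demo with a stubbed fetcher (no real HTTP requests):
const links = Array.from({ length: 5 }, (_, i) => `ark-${i}`);
processAll(links, async (link) => ({ link }), 2).then((out) => {
  console.log(out.length); // 5 links at chunk size 2 → 3 chunks
});
```

In the real code, `fetchJson` would be `(endpoint) => axios.get(...).then((res) => res.data)`; note that in the original attempt `res` inside `.then` is the whole array of responses, not a single response, which is another reason `resArray` ends up wrong. Writing the file inside the `for` loop, after the `await`, guarantees each chunk's data is complete before it is saved.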
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
