Uploading a big JSON file to S3 from Lambda
I am preparing a big JSON payload (around 1 GB) in a Lambda function by reading from a database, and I then want to upload that JSON to S3. My Lambda is timing out. Timeline of my program: generating the JSON takes around 6 minutes, which leaves about 9 minutes, and that is not enough to finish the upload. I am using JsonStreamStringify to stream the JSON from Lambda to S3.
Any thoughts on a better way to do this, or anything else I could use?
Here is a sample snippet of how I upload the JSON from Lambda:
```js
const stream = require('stream');
const AWS = require('aws-sdk');
const JsonStreamStringify = require('json-stream-stringify');

const s3 = new AWS.S3();

// Returns a writable stream plus the promise of the managed S3 upload.
const uploadStream = ({ Bucket, Key }) => {
  const pass = new stream.PassThrough();
  const params = {
    Bucket,
    Key,
    Body: pass,
    ContentType: 'application/json; charset=utf-8',
    ServerSideEncryption: 'AES256',
  };
  return {
    writeStream: pass,
    promise: s3.upload(params).promise(),
  };
};

// conf, filePath, and reportData come from elsewhere in my handler.
const { writeStream, promise } = uploadStream({
  Bucket: conf.customerS3Settings.bucketName,
  Key: filePath,
});

const jsonStream = new JsonStreamStringify(reportData.reportJson);
jsonStream.once('error', (err) =>
  console.log('Error at path', jsonStream.stack.join('.'), err));
// 'end' fires once the whole object has been stringified.
jsonStream.once('end', () => {
  console.log('JSON stringified and uploaded in json export');
});
jsonStream.pipe(writeStream);
const uploadResponse = await promise;
```
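For reference, one knob I have looked at: in aws-sdk v2, `s3.upload()` accepts a second options argument that tunes the underlying multipart upload, and (assuming the same v2 client as in the snippet above) raising the part size and concurrency may shorten the transfer of a ~1 GB body. The specific numbers here are illustrative, not measured:

```javascript
// Multipart tuning options for aws-sdk v2's managed s3.upload()
// (values are illustrative assumptions, not benchmarked settings).
const uploadOptions = {
  partSize: 16 * 1024 * 1024, // 16 MiB parts (the v2 default is 5 MiB)
  queueSize: 8,               // up to 8 parts in flight at once (default 4)
};
// Plugged into the snippet above, the call would read:
//   promise: s3.upload(params, uploadOptions).promise(),
```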
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow