Node.js: Upload CSV string content to S3
I have tried to upload CSV content to S3, but it is somehow not working. Below is my code to store the CSV data in a .csv.gz file:
var filename = "ff1.csv.gz";
var csvData = "username,email,year,month\n";
csvData = csvData + username + "," + email + "," + year_data + "," + month_data + "\n";
var paramsu = {
    Bucket: bucket1,
    Key: filename,
    Body: csvData,
    ContentType: 'application/octet-stream'
};
var res = await s3.putObject(paramsu).promise();
The resulting .gz file is corrupted and cannot be extracted.
Any help would really save my day.
Solution 1:[1]
If you don't have the file ready, you can create a write stream and pass it to the S3 SDK.
The question is answered here: https://stackoverflow.com/a/50291380/2688699
Solution 2:[2]
We can upload image/CSV/Excel files to AWS S3 using multer-s3.
I'm using the .single(fieldname) method for uploading a single file.
const aws = require('aws-sdk');
const multer = require('multer');
const multerS3 = require('multer-s3');

const s3 = new aws.S3({
    accessKeyId: process.env.AWS_ACCESS_KEY,
    secretAccessKey: process.env.AWS_SECRET_KEY,
    region: process.env.REGION,
});

const upload = multer({
    storage: multerS3({
        s3: s3,
        bucket: process.env.AWS_S3_BUCKET,
        metadata: function (req, file, cb) {
            cb(null, { fieldName: 'Meta_Data' });
        },
        key: function (req, file, cb) {
            cb(null, file.originalname);
        },
    }),
    // `limits` is a multer option, not a multer-s3 option, so it goes here.
    limits: {
        fileSize: 1024 * 1024 * 5 // allow only 5 MB files
    }
}).single('file');
exports.uploadfile = async (req, res, next) => {
    try {
        upload(req, res, function (err) {
            if (err) {
                console.log(err);
            }
            console.log(req.file.location);
        });
    } catch (err) {
        res.status(400).json({
            status: 'fail',
            message: err.message
        });
    }
};
In the routes file:
router.route('/')
    .post(imageController.uploadfile);
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
Solution 1: Tuan Anh Tran
Solution 2: sai krishna