How to upload multiple photos to AWS at once?
I use React Native with GraphQL.
I can currently upload a single photo to AWS successfully, but I want to upload multiple files at once.
If I run a loop, I can upload multiple files to AWS, as below:
const onValid = ({ caption }) => {
  const uploadPhotoArray = selectPhoto.map((sp, index) => {
    return new ReactNativeFile({
      uri: sp,
      name: `${index}.jpg`,
      type: "image/jpeg",
    });
  });
  for (let i = 0; i < uploadPhotoArray.length; i++) {
    uploadPhotoMutation({
      variables: {
        caption,
        file: uploadPhotoArray[i],
      },
    });
  }
};
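As a side note, the loop above fires the mutations without awaiting them, so there is no way to know when all uploads have finished or whether any failed. A minimal sketch of awaiting them all with `Promise.all` (the mutation is stubbed here so the flow is visible; in the app it would be the Apollo mutation function from the question):

```javascript
// Stub standing in for the real uploadPhotoMutation from the question.
// It resolves with a shape similar to an Apollo mutation result.
const uploadPhotoMutation = async ({ variables }) => ({
  data: { uploadPhoto: { file: variables.file.name } },
});

async function uploadAll(caption, uploadPhotoArray) {
  // Start every mutation in parallel and wait for all of them to settle.
  const results = await Promise.all(
    uploadPhotoArray.map((file) =>
      uploadPhotoMutation({ variables: { caption, file } })
    )
  );
  // Collect the uploaded file names from each result.
  return results.map((r) => r.data.uploadPhoto.file);
}
```

This still creates one row per mutation call; it only makes the loop's completion and errors observable.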
But the problem is that if I upload 2 images, it creates two rows on AWS and in the backend.
In order to make it one row (post) with 2 images, I changed the file column in the backend to an array instead of a string.
However, I think the problem is on the frontend.
I wanted uploadPhotoArray to be an array like the one below:
Array [
  ReactNativeFile {
    "name": "0.jpg",
    "type": "image/jpeg",
    "uri": "file:///storage/emulated/0/DCIM/Camera/20220306_020216.jpg",
  },
  ReactNativeFile {
    "name": "1.jpg",
    "type": "image/jpeg",
    "uri": "file:///storage/emulated/0/DCIM/Camera/20220305_201130.jpg",
  },
]
Then I tried to run uploadPhotoMutation with this array:
uploadPhotoMutation({
  variables: {
    caption,
    file: uploadPhotoArray,
  },
});
This should pass the array to the backend, but it seems it is not working.
If I can't pass array data to the backend, meaning files can only be sent one by one, then I need to turn the incoming data into an array on the backend.
But that's also hard for me.
If you want to clarify my question, I can answer in real time, and chat is also possible. Please give me any idea. :(
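For the mutation to accept an array, the GraphQL schema itself has to declare a list of uploads; sending an array into a field typed as a single `Upload` will fail validation. A sketch of what the mutation definition might look like, assuming the `Upload` scalar from graphql-upload and field names taken from the question (the exact type names here are assumptions):

```graphql
# Sketch: a mutation that takes one caption and a list of files.
type Mutation {
  uploadPhoto(caption: String, file: [Upload!]!): Photo
}
```

With a list type on the backend, passing `file: uploadPhotoArray` from the client as shown above becomes a valid single request, which in turn creates a single post row.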
Backend code:
const fileUrl = fileArrayCheck
  ? await uploadFileToS3(file, loggedInUser.id, "uploads")
  : await uploadStringleFileToS3(file, loggedInUser.id, "uploads");
export const uploadStringleFileToS3 = async (file, userId, folderName) => {
  console.log(file);
  AWS.config.update({
    credentials: {
      accessKeyId: process.env.AWS_KEY,
      secretAccessKey: process.env.AWS_SECRET,
    },
  });
  const { filename, createReadStream } = await file;
  const readStream = createReadStream();
  const objectName = `${folderName}/${userId}-${Date.now()}-${filename}`;
  const { Location } = await new AWS.S3()
    .upload({
      Bucket: "chungchunonuploads",
      Key: objectName,
      ACL: "public-read",
      Body: readStream,
    })
    .promise();
  return [Location];
};
export const uploadFileToS3 = async (filesToUpload, userId, folderName) => {
  // The callback must return the promise; with a braced arrow body and no
  // `return`, Promise.all would receive an array of undefined.
  const uploadPromises = filesToUpload.map((file) =>
    uploadStringleFileToS3(file, userId, folderName)
  );
  return Promise.all(uploadPromises);
};
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow