How to optimally download a list of files from S3

I have a Java application that works with files stored in an S3 bucket. At times I need to download multiple files for my application to do its work. Currently I have a simple for loop for this job, as below:

private List<JSONObject> getFilesFromS3(List<String> filenames) throws IOException {
    List<JSONObject> output = new ArrayList<>();
    for (String item : filenames) {
        // try-with-resources, so each object's HTTP connection is released
        try (S3Object s3Object = s3Service.getFile(item);
             S3ObjectInputStream is = s3Object.getObjectContent()) {
            String str = new String(is.readAllBytes(), StandardCharsets.UTF_8);
            output.add(XML.toJSONObject(str));
        }
    }
    return output;
}

As you can imagine, for a few files this is manageable. But for 10, 20, or 30 files, the sequential downloads make the app rather slow. I wonder if there is some S3 library or method that can speed this up?
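
For what it's worth, below is a rough sketch of what I imagine a parallel version would look like, using a plain ExecutorService and the AmazonS3 client directly in place of my s3Service wrapper (the bucket name and pool size are placeholders I made up). I'd rather not maintain something like this by hand if a library already does it:

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.services.s3.model.S3ObjectInputStream;
import org.json.JSONObject;
import org.json.XML;

import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.stream.Collectors;

public class ParallelS3Fetcher {
    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
    private final String bucket = "my-bucket"; // placeholder

    public List<JSONObject> getFilesFromS3(List<String> filenames) {
        // One task per key; the pool size caps concurrent connections to S3
        ExecutorService pool = Executors.newFixedThreadPool(10);
        try {
            List<CompletableFuture<JSONObject>> futures = filenames.stream()
                    .map(key -> CompletableFuture.supplyAsync(() -> fetchOne(key), pool))
                    .collect(Collectors.toList());
            // join() rethrows any download failure as a CompletionException
            return futures.stream()
                    .map(CompletableFuture::join)
                    .collect(Collectors.toList());
        } finally {
            pool.shutdown();
        }
    }

    private JSONObject fetchOne(String key) {
        // try-with-resources returns the HTTP connection to the client's pool
        try (S3Object s3Object = s3.getObject(bucket, key);
             S3ObjectInputStream is = s3Object.getObjectContent()) {
            return XML.toJSONObject(new String(is.readAllBytes(), StandardCharsets.UTF_8));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}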

I found TransferManager, which almost does what I need. Unfortunately, it only offers to download an entire directory (key prefix), which is not an option seeing as my S3 bucket has ~50k files. Downloading 50k files to find the specific 10 is not worth it.
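
For reference, this is roughly the TransferManager call I looked at (AWS SDK for Java v1; the bucket, prefix, and target directory are placeholders). It takes a key prefix plus a local directory, not a list of specific keys:

import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.transfer.MultipleFileDownload;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.TransferManagerBuilder;

import java.io.File;

public class TransferManagerExample {
    public static void main(String[] args) throws InterruptedException {
        TransferManager tm = TransferManagerBuilder.standard()
                .withS3Client(AmazonS3ClientBuilder.defaultClient())
                .build();
        try {
            // Fetches everything under the prefix -- there is no overload
            // that accepts an explicit list of keys
            MultipleFileDownload download =
                    tm.downloadDirectory("my-bucket", "some/prefix", new File("/tmp/s3-out"));
            download.waitForCompletion();
        } finally {
            tm.shutdownNow();
        }
    }
}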

So what else can I try? I have encountered methods like listObjects() before, but they would all still require some sort of loop, so I assume they can't help me here.
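
For completeness, this is the kind of loop I mean (bucket and prefix are placeholders). listObjects() only returns key metadata a page at a time, so I would still need a separate getObject() call per key to fetch the actual content:

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectListing;
import com.amazonaws.services.s3.model.S3ObjectSummary;

public class ListObjectsExample {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        ObjectListing listing = s3.listObjects("my-bucket", "some/prefix");
        while (true) {
            for (S3ObjectSummary summary : listing.getObjectSummaries()) {
                // Only the key and metadata here; content needs its own getObject()
                System.out.println(summary.getKey());
            }
            if (!listing.isTruncated()) {
                break;
            }
            listing = s3.listNextBatchOfObjects(listing);
        }
    }
}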

Is there something else that I missed, or does what I want simply not exist yet? Effectively, I would like something that produces a list of S3Objects (which, as you can see, I would then turn into a list of JSONObject items).


