How to find the biggest files across all your S3 buckets

My client has 60+ buckets, and we pay hundreds of dollars per month to store this data, yet we have no easy way to distinguish what is useful from what is legacy.

Clicking through each bucket to find what is taking up space is tedious.

Is there a way to list all objects from all buckets and rank them by size, so we can delete what is old and large?
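One possible approach is a short boto3 script: enumerate every bucket the credentials can see, page through each bucket's object listing, and keep the largest entries. This is a sketch, not a definitive tool; it assumes credentials with `s3:ListAllMyBuckets` and `s3:ListBucket` permissions, and the `top_n_largest` helper name is my own.

```python
import heapq


def top_n_largest(objects, n=20):
    """Return the n largest (bucket, key, size) tuples, biggest first."""
    return heapq.nlargest(n, objects, key=lambda o: o[2])


def list_all_objects():
    """Yield (bucket, key, size) for every object in every bucket
    visible to the current AWS credentials. Requires boto3."""
    import boto3  # imported here so top_n_largest stays dependency-free

    s3 = boto3.client("s3")
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=name):
            for obj in page.get("Contents", []):
                yield (name, obj["Key"], obj["Size"])


if __name__ == "__main__":
    # Print the 20 largest objects across all buckets.
    for bucket, key, size in top_n_largest(list_all_objects()):
        print(f"{size:>15,}  s3://{bucket}/{key}")
```

Note that paging through tens of millions of objects can take a long time and incurs LIST request charges; for very large accounts, Amazon S3 Inventory or S3 Storage Lens may be a cheaper way to get the same size breakdown.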



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
