Properly storing, archiving and deleting TBs of data every day - PostgreSQL

I have a system that generates a lot of data, around 3-4 TB per day, receiving it from multiple sources 24 hours a day. I also have a replica of this system, which would give me some room for downtime if needed...

I only need the latest 24 hours of data to be available; the rest I can compress and store on HDDs.

How would you tackle this problem? Does PostgreSQL offer enough built-in features to handle it without external software?
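From what I've read so far, native declarative partitioning looks like the closest built-in fit. Here is a minimal sketch of what I'm imagining, assuming a hypothetical `events` table keyed by an `ingested_at` timestamp (the table name, column names, dates, database name, and paths below are all made up):

    -- Parent table partitioned by ingestion time.
    CREATE TABLE events (
        ingested_at timestamptz NOT NULL,
        payload     jsonb
    ) PARTITION BY RANGE (ingested_at);

    -- Pre-create tomorrow's partition before data starts arriving.
    CREATE TABLE events_2024_06_02 PARTITION OF events
        FOR VALUES FROM ('2024-06-02') TO ('2024-06-03');

    -- Once a day ages out of the 24h window: detach it (cheap, and
    -- non-blocking with DETACH ... CONCURRENTLY on PostgreSQL 14+),
    -- dump it compressed to the archive disks, then drop it.
    ALTER TABLE events DETACH PARTITION events_2024_06_01;
    -- From the shell:
    --   pg_dump -Fc -t events_2024_06_01 mydb \
    --     > /mnt/archive/events_2024_06_01.dump
    DROP TABLE events_2024_06_01;

Is something along these lines viable at 3-4 TB per day, or would I end up needing external tooling (pg_partman, cron jobs, etc.) to manage the partition lifecycle anyway?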



Sources

Source: Stack Overflow, licensed under CC BY-SA 3.0.