How to limit memory usage during tar

I need to tar (or otherwise compress) more than 2.7 million files (150 GB total).

However, with this many files the tar command uses way too much memory and my system crashes. What can I do?

tar -cf /path/filename.tar /path_to_file/

I've tried to do it in batches (multiple tar files would be fine), selecting files by creation date with find, but find uses even more memory.



Solution 1:[1]

Not sure if this is an answer exactly, as it doesn't say how to explicitly lower tar's memory usage, but...

I think you can pipe tar's output to pigz (parallel gzip) as the compression program, then set its thread count to better manage memory. Maybe something like:

tar cvf - paths-to-archive | pigz -p 4 > archive.tar.gz

where the argument to -p (4 in the example) is the number of threads/cores to use.
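An equivalent form, assuming GNU tar, lets tar invoke the compressor itself via `-I` (`--use-compress-program`), avoiding the explicit pipe; `paths-to-archive` and `archive.tar.gz` are placeholder names:

```shell
# GNU tar runs the quoted command as the compression filter.
tar -I 'pigz -p 4' -cf archive.tar.gz paths-to-archive
```

Note that pigz speeds up and parallelizes the compression stage; tar's own memory use while scanning millions of files is a separate concern, which batching (as above) addresses more directly.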

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Yuri