I need to compress a directory containing around 350,000 fairly small files that amount to about 100GB total. I am on OSX and currently use the standard "Compress" tool, which converts the directory into a .zip file. Is there a faster way to do this?
macOS – the fastest compression method for a large number of files
compression, gzip, macos, tar, zip
Best Answer
For directories I'd use tar piped to bzip2 with max-compression. A simple way to go is something like:
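    # archive and directory names are placeholders; the 'j' flag selects bzip2 compression
    tar cfj archive.tar.bz2 dir-to-be-archived/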
This works great if you don't intend to fetch small sets of files out of the archive and are just planning to extract the whole thing whenever/wherever required. Yet, if you do want to get a small set of files out, it's not too bad.
I prefer to call such archives filename.tar.bz2 and extract with the 'xfj' option, i.e. tar xfj filename.tar.bz2.

The max-compression pipe looks like this:
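    # '-9' asks bzip2 for its maximum compression; tar streams the archive to stdout ('-')
    tar cf - dir-to-be-archived/ | bzip2 -9 > archive.tar.bz2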
Note: the 'bzip2' method, with its higher compression, tends to be slower than the regular gzip you get from 'tar cfz'.

If you have a fast network and the archive is going to be placed on a different machine,
you can speed up with a pipe across the network (effectively using two machines together).
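For example, with placeholder user, host, and remote path:

    # tar runs locally while 'bzip2 -9' runs on the remote host, splitting the work between machines
    tar cf - dir-to-be-archived/ | ssh user@remote-host 'bzip2 -9 > /path-to-file/archive.tar.bz2'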
Dennis