Normally I would just do something like:
tar -czf archive.tar.gz *.csv
But when there are too many files in the directory, the expanded command line exceeds the system's argument-length limit (ARG_MAX) and the command fails with "Argument list too long."
In these cases I would normally resort to using find. Something like:
find /path -name '*.csv' -exec tar -rf ./archive.tar.gz {} +
But this only seems to work if I don't include the -z option, because you can't append to a compressed archive, and using -c instead of -r would overwrite the first archive, since find invokes tar multiple times when the files don't fit on a single command line.
The only other solution I could come up with is to create a .tar file with find
(as above) and then use a second command to compress it. Is there a better way to handle cases like this?
I'm using Ubuntu Linux.
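For reference, the two-step workaround described above can be sketched as follows (a temporary directory and sample filenames stand in for the real data):

```shell
# Create a throwaway directory with a couple of sample files.
dir=$(mktemp -d)
touch "$dir/a.csv" "$dir/b.csv"

# Step 1: append matching files to an uncompressed archive.
# With '{} +' find batches the arguments, and -r appends on each
# invocation instead of overwriting the archive.
find "$dir" -name '*.csv' -exec tar -rf "$dir/archive.tar" {} +

# Step 2: compress the finished archive in a separate pass.
gzip "$dir/archive.tar"    # produces archive.tar.gz
```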
Best Answer
As a robust solution, use find to separate filenames with null characters, and pipe directly to tar, which can read a null-delimited file list from standard input:

find /path -name '*.csv' -print0 | tar -czf archive.tar.gz --null -T -

This will now handle all file names correctly and is not limited by the number of files either.
Using ls to generate a list of filenames to be parsed by another program is a common antipattern that should be avoided whenever possible. find can generate null-delimited output (-print0) that most utilities can read or parse further. Since the null character is one of only two characters that cannot appear in a filename (the other being /, obviously), you'll always be safe with that.
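A quick way to convince yourself that null delimiting copes with hostile names is a throwaway test like this (the directory and filenames are made up for the demonstration):

```shell
# Sample directory containing a filename with an embedded newline.
dir=$(mktemp -d)
touch "$dir/good.csv" "$dir"/$'bad\nname.csv'

# find emits NUL-separated paths; GNU tar reads them via --null -T -,
# so the newline inside the name causes no mis-parsing.
find "$dir" -name '*.csv' -print0 |
    tar -czf "$dir/archive.tar.gz" --null -T -
```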