SCP – Best Methods to Copy Large Numbers of Small Files

scp, tar

I have a directory containing several gigabytes spread across several thousand small files. I want to copy it over the network with scp more than once. CPU time on the source and destination machines is cheap, but the network overhead of copying each file individually is huge. I would tar/gzip it up and ship it over, but the source machine is short on disk space.

Is there a way for me to pipe the output of tar czf <output> <directory> to scp? If not, is there another easy solution? My source machine is ancient (SunOS), so I'd rather not install anything on it.

Best Answer

You can pipe tar across an ssh session:

$ tar czf - <files> | ssh user@host "cd /wherever && tar xvzf -"
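The first tar writes a compressed archive to stdout (`-`), ssh forwards that stream, and the second tar unpacks it on the remote side, so nothing is ever written to disk on the source. You can verify the technique locally by replacing the ssh hop with a plain pipe (the `/tmp` paths below are just for illustration):

```shell
# Local demonstration of the streaming-tar idea, no network required.
# Create a small source tree:
mkdir -p /tmp/tarpipe-src /tmp/tarpipe-dst
echo "hello" > /tmp/tarpipe-src/file.txt

# Stream a gzipped archive from one directory into another,
# exactly as the ssh version does, minus the remote shell:
tar czf - -C /tmp/tarpipe-src . | tar xzf - -C /tmp/tarpipe-dst

cat /tmp/tarpipe-dst/file.txt
```

Note that GNU tar's `-C` flag changes directory before archiving or extracting; the `cd /wherever &&` in the answer achieves the same thing and is more portable to old tar implementations like the one on SunOS.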