This link describes how to copy tarred files to minimise the amount of data sent over the network. I am trying to do something slightly different.
I have a number of remote files on different subdirectory levels:
remote:/directory/subdir1/file1.ext
remote:/directory/subdir1/subsubdir11/file11.ext
remote:/directory/subdir2/subsubdir21/file21.ext
And I have a file that lists all of them:
remote:/directory/allfiles.txt
To copy them most efficiently, on the remote side I could just run
tar zcvf allfiles.tgz `cat /directory/allfiles.txt`
but there is not enough free space on the remote host to do that.
I do have enough storage space on my local disk. Is there a way to tar an incoming stream from a remote server (using scp or ssh for the transfer)?
Something like
/localdir$ tar zc - | ssh remote cat /directory/allfiles.txt
I would guess, but that would only list the remote files on the local host.
Best Answer
You got it almost right; just run tar on the remote host instead of locally. The command should look something like the following:
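The concrete command line did not survive in this copy of the answer, so here is a sketch of what it presumably was: run tar on the remote host, feed it the list file via GNU tar's -T (--files-from) option, write the archive to stdout, and let ssh deliver that stream into a local file. The host name "remote" and the archive name are placeholders:

```shell
# Run tar remotely; "-T" reads the list of files to archive, and
# "czf -" writes the gzipped archive to stdout, so nothing is
# stored on the remote disk:
#
#   ssh remote 'tar czf - -T /directory/allfiles.txt' > allfiles.tgz
#
# The same mechanism demonstrated locally, with a plain pipe standing
# in for the ssh transport:
tmp=$(mktemp -d)
mkdir -p "$tmp/subdir1"
echo hello > "$tmp/subdir1/file1.ext"
printf '%s\n' "$tmp/subdir1/file1.ext" > "$tmp/allfiles.txt"
# tar strips the leading "/" from member names (and says so on stderr)
tar czf - -T "$tmp/allfiles.txt" 2>/dev/null | cat > "$tmp/allfiles.tgz"
tar tzf "$tmp/allfiles.tgz"
```

Since the archive is assembled on the fly and streamed, the remote side only ever holds one tar block in memory, never the whole allfiles.tgz.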