Recursive scp without following links or creating a giant tar file

scpsymlink

So I did a recursive scp on my remote fileserver (in another state), and it created an infinite loop of links in my remote web directory…

http://www.linuxquestions.org/questions/linux-general-1/recursive-scp-w-o-following-links-658857/ says that I can try creating a giant tar file. There is a problem with this, though: I'm running the recursive scp on a Linux machine in my office, and I'm copying all the files to my external hard drive, which is FAT32-formatted (because I need something readable by both UNIX and Windows). FAT32 doesn't support large file sizes, so I would have to try something different.

There's also an rsync option, but the Linux machine in my office is very primitive (it's an IGEL thin client), so it doesn't have rsync…

Best Answer

I would not recommend using scp for transferring large file trees directly: it handles neither hard nor soft links properly, and the stream is not compressed.

I'd recommend cpio with (de)compression on the fly:

ssh user@host "cd /path/to/files && find . | cpio -ov | bzip2 -c" | bunzip2 -c | cpio -ivd

Also, find can apply additional conditions, such as "files smaller than 4 GB":

find . -size -4G | ...
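Here is a quick local illustration of filtering by size before archiving. I use a byte-count test (`-size -1048576c`, i.e. under 1 MiB) because GNU find's `M`/`G` suffixes round sizes up before comparing; the /tmp paths are placeholders:

```shell
# Archive only files smaller than 1 MiB; the 2 MiB file is skipped.
rm -rf /tmp/sz_src /tmp/sz_dst
mkdir -p /tmp/sz_src /tmp/sz_dst
printf 'small' > /tmp/sz_src/small.txt
dd if=/dev/zero of=/tmp/sz_src/big.bin bs=1024 count=2048 2>/dev/null

# Only names passing the size test reach cpio's file list.
( cd /tmp/sz_src && find . -size -1048576c | cpio -ov ) \
  | ( cd /tmp/sz_dst && cpio -ivd )
```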

To handle spaces (and other unusual characters) in file names properly, have find emit null-terminated names and pass -0 to cpio on the archiving (copy-out) side:

find . -print0 | cpio -0 -ov | ...