The most *robust* remote file copy

Tags: file-copy, files, remote, rsync

How would I go about copying files over a very unstable internet connection?

Sometimes the connection is lost; other times the IP address of one machine or the other (sometimes both) changes, though dynamic DNS will catch that.

Which tool or command would you suggest?

I've heard that rsync is pretty nifty at copying only the diff, but that means a lot of work either restarting it again and again or putting it into a while loop or cron job.

I was hoping for something easier and foolproof.

Addendum:

Every now and then I need to copy a couple of directories containing a few very large files (>5 GB each) from one site to the other. After the copy, both copies are moved locally to different locations.

I can't do anything at the networking level; I wouldn't have the knowledge to do so.

I'd rather not set up a web server in order to use wget. That is not secure and seems like a circuitous route.

I have already established an SSH connection and could now use rsync, since rsync is already installed on both machines (I wouldn't be able to get an rsync daemon up and running).

Any hints on how I could make an intelligent rsync over SSH so that it keeps trying when the line is temporarily cut? But rsync itself won't be the problem when the SSH connection dies, so something like this (https://serverfault.com/questions/98745/) probably won't work:

while ! rsync -a .... ; do sleep 5 ; done

Any ideas?

Thanks a lot!

Gary

Best Answer

I would definitely suggest rsync. I use rsync to copy files anytime I think that the connection has any possibility of being interrupted. If the copy fails, I know I can simply start it again.

It's easy to put it in a while loop if you need it to automatically restart until it succeeds.
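As a minimal sketch of that retry loop (host and paths are placeholders, and it assumes passwordless SSH key authentication so the loop can run unattended): --partial keeps partially transferred files so a retry can resume large transfers instead of starting over, --timeout makes rsync give up on a stalled connection instead of hanging, and the SSH keepalive options make a dead link fail within about a minute so the loop can restart.

#!/bin/bash
# Retry sketch: repeat rsync until it exits successfully.
# --partial      keep incomplete files so the next attempt can pick them up
# --timeout=60   abort if no data moves for 60 seconds instead of hanging
# ServerAliveInterval/CountMax let ssh detect a dead connection quickly
until rsync -a --partial --timeout=60 \
      -e "ssh -o ServerAliveInterval=15 -o ServerAliveCountMax=4" \
      /local/source/dir/ user@remote.example.com:/remote/target/dir/
do
    echo "rsync failed, retrying in 30 seconds..." >&2
    sleep 30
done

Since the files are large and don't change between attempts, adding --append-verify (available in rsync 3.x on both ends) can also speed up resumption of a partially transferred file rather than re-checking it with the delta algorithm.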
