Linux – Is it possible to download extremely large files intelligently or in parts via SSH from Linux to Windows

download, file-transfer, linux, ssh, windows

I have a ~35 GB file on a remote Ubuntu Linux server. Locally, I am running Windows XP, so I connect to the remote server using SSH (specifically, a Windows program called SSH Secure Shell Client, version 3.3.2).

Although my broadband internet connection is quite good, my download of the large file often fails with a "Connection Lost" error message. I suspect this happens because the connection drops for a second or two every several hours. Since the file is so large, the download takes 4.5 to 5 hours, and the connection apparently drops briefly somewhere in that window. I think this because I have successfully downloaded files of this size using the same internet connection and the same SSH software on the same computer; in other words, sometimes I get lucky and the download finishes before the connection drops.

Is there any way to download the file intelligently, so that the operating system or software "knows" where it left off and can resume from that point if the internet connection breaks?

Perhaps it is possible to download the file in sections? I do not know whether I can conveniently split the file into multiple files, though; I suspect that would be very difficult, since the file is binary and not human-readable.

As it is now, if the entire ~35 GB download doesn't finish before the connection breaks, I have to start the download over, overwriting the ~5-20 GB partial file that had been downloaded locally so far.

Do you have any advice? Thanks.

Best Answer

My 'proper' solution would be to find and fix whatever is causing the disconnects, but these things might work as workarounds:

  1. Use split (man split: "split a file into pieces"); it is installed on virtually every Unix system. Split the file on the server, download the pieces one by one, and reassemble them locally (a sketch follows this list).
  2. Split the file via dd (dd if=inputfilename of=file_part0 bs=500MB skip=0 count=1; repeat with skip=1 and a different output file name). Note that count=1 is required; without it, dd copies everything from the offset to the end of the file. See the loop sketch after this list.
  3. Use a program that can resume downloads. FTP would work (but eww: plain-text passwords, and separate control and data connections that do not play nicely with most firewalls). A resumable-transfer sketch follows this list.
  4. Cheat and move the file to a web directory. Most browsers support resuming a download (see the wget sketch after this list).
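
For item 1, a minimal sketch of the split approach. The file name bigfile.bin, the 2 GB piece size, and the part_ prefix are placeholders, not anything from the original question:

    # On the server: cut bigfile.bin into 2 GB pieces
    # (bigfile.bin.part_aa, bigfile.bin.part_ab, ...)
    split --bytes=2G bigfile.bin bigfile.bin.part_

    # Record checksums so each downloaded piece can be verified locally
    sha256sum bigfile.bin.part_* > bigfile.bin.parts.sha256

Download each piece over SSH as usual, re-downloading only the pieces that fail, then reassemble on the Windows side with copy /b bigfile.bin.part_aa + bigfile.bin.part_ab + ... bigfile.bin (or with cat on any Unix box).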
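
For item 2, a hedged sketch of the dd variant as a loop (again with placeholder names; ~35 GB at 500 MB per piece is about 70 pieces):

    # Each dd call copies exactly one 500 MB block (count=1), starting
    # at input block number $i (skip=$i). Zero-padded names keep the
    # pieces in lexical order for reassembly.
    for i in $(seq 0 69); do
        dd if=bigfile.bin of=part_$(printf '%02d' "$i") bs=500MB skip="$i" count=1
    done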
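
For item 3, one possibility, assuming you can run lftp on the client side (for example under Cygwin on Windows; the user, host, and path below are placeholders): lftp can resume an interrupted transfer over SFTP, which rides on the SSH service the server already runs, so nothing new needs to be installed remotely.

    # pget -c continues a previously interrupted download
    lftp -e 'pget -c /path/to/bigfile.bin; quit' sftp://user@server.example.com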
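
For item 4, a sketch of the web-directory approach, assuming Apache (or similar) is already serving /var/www on the Linux box; the host name is a placeholder. wget builds exist for Windows, and its -c flag resumes a partial download instead of restarting:

    # On the server: expose the file over HTTP
    sudo mv bigfile.bin /var/www/

    # On the Windows client:
    #   -c        resume a partially downloaded file
    #   --tries=0 retry indefinitely after a dropped connection
    wget -c --tries=0 http://server.example.com/bigfile.bin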