Ubuntu – How to download a whole FTP site using command line

downloads, ftp, wget

I need to download an FTP site (actually the source code of a web site) containing a very large number of relatively small files. Downloading it with FileZilla took more than a day, but I believe it could be downloaded much faster if many files were fetched at once. Unfortunately there is no SSH access to the server and no way to archive the files on the server side.

So, at minimum, the question is how to download a whole FTP site into a folder using command-line tools. Even better would be if the download could be parallelized, fetching many files simultaneously instead of one by one.

Best Answer

Try the following:

wget -r ftp://username:password@myserver.com

This recurses to wget's default depth of 5 levels; to change that, add the -l option.
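For example, to recurse ten levels deep (the depth value here is just illustrative, and the server and credentials are the same placeholders as above):

wget -r -l 10 ftp://username:password@myserver.com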

Some servers don't like this behavior, and you risk getting blacklisted because of the load it puts on the server. To avoid this, use the -w option to wait a specified number of seconds between retrievals.
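For instance, to pause two seconds between retrievals (the wait value is just an example):

wget -r -w 2 ftp://username:password@myserver.com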

More info (as well as caveats) can be found here:

http://www.gnu.org/software/wget/manual/wget.html#Recursive-Download

http://linuxreviews.org/quicktips/wget/

The --user and --password options are useful when the username or password contains special characters.
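For example, passing the credentials as separate options keeps characters such as @ or : out of the URL (the username, password, and server here are placeholders):

wget -r --user='username' --password='p@ss:w0rd' ftp://myserver.com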