Linux – Download directory & subdirectories via wget

Tags: bash, linux, ssh, wget

I have a folder with directory listing enabled on the web (http://example.com/folder1/folder2/).

/folder2 contains multiple folders with PDF files in them. I want to download all the content of /folder2, including all the subfolders and files, to my server over SSH using wget. I've tried the following, but I keep getting only an index.html and a robots.txt file.

[root@myserver downloads]# wget -r --no-parent --reject "index.html*" http://www.example.com/folder1/folder2/
--2015-08-07 07:46:36--  http://www.example.com/folder1/folder2/
Resolving www.example.com... 192.168.1.1
Connecting to www.example.com|192.168.1.1|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: `www.example.com/folder1/folder2/index.html'

    [         <=>                           ] 4,874,325    138K/s   in 37s     

2015-08-07 07:47:42 (128 KB/s) - `www.example.com/folder1/folder2/index.html' saved [4874325]

Loading robots.txt; please ignore errors.
--2015-08-07 07:47:42--  http://www.example.com/robots.txt
Connecting to www.example.com|192.168.1.1|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 26 [text/plain]
Saving to: `www.example.com/robots.txt'

100%[======================================>] 26          --.-K/s   in 0s      

2015-08-07 07:47:42 (1.42 MB/s) - `www.example.com/robots.txt' saved [26/26]

Removing www.example.com/folder1/folder2/index.html since it should be rejected.

FINISHED --2015-08-07 07:47:42--
Downloaded: 2 files, 4.6M in 37s (128 KB/s)
[root@myserver downloads]# 

Other commands I've tried, with similarly failed results:

wget -m -p -E -k -K -np http://example.com/folder1/folder2/

wget -r http://example.com/folder1/folder2/ -nd -P /downloads -A PDF

Best Answer

I want to download all the content of /folder2, including all the subfolders and files, to my server via SSH using wget.

I suppose you just want to download via wget; SSH is not really the issue here.
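In that case the usual pattern is to open an SSH session to the server and run wget there, which your prompt suggests you are already doing. For completeness, a sketch (host name and target directory taken from your log):

# run from your workstation: log in to the server and invoke wget there
ssh root@myserver 'cd /downloads && wget -r --no-parent http://www.example.com/folder1/folder2/'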

Solution by Attilio:

wget --mirror --page-requisites --adjust-extension --no-parent --convert-links \
    --directory-prefix=folder2 http://example.com/folder1/folder2/
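
Note that without -nH this recreates the host name and the full remote path locally, so you would end up with roughly the following layout (subfolder and file names invented for illustration):

folder2/
└── example.com/
    └── folder1/
        └── folder2/
            ├── sub1/
            │   └── a.pdf
            └── sub2/
                └── b.pdf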

Edit

The above solution is well suited for mirroring whole websites; sorry, I was a bit too quick to answer, and it is not optimal for grabbing a tree of PDF files.

wget -m -nH --cut-dirs=1 -np -R 'index.*' http://example.com/folder1/folder2/
  • -m, --mirror: download recursively (equivalent to -r -N -l inf --no-remove-listing)
  • -nH, --no-host-directories: do not put the files inside a directory named after the host
  • --cut-dirs=1: skip the first remote directory component (here folder1) when creating the local hierarchy
  • -np, --no-parent: do not ascend to the parent directory
  • -R, --reject 'index.*': do not keep files whose names match index.*

Might be useful: -e robots=off tells wget to ignore the site's robots.txt, which may well be what limited your first attempt to index.html.
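
Since the goal is PDF files, the same flags can be combined with an accept list. A sketch (wget still fetches the intermediate index pages to discover links, then deletes them because they do not match the pattern):

wget -m -nH --cut-dirs=1 -np -A '*.pdf' -e robots=off \
    http://example.com/folder1/folder2/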

Example

$ wget -m -nH --cut-dirs=4 -np --reject 'index.*' \
 http://ftp.lip6.fr/pub/linux/distributions/slackware/slackware64-current/source/a/bin/
$ tree
.
└── slackware64-current/
    └── source/
        └── a/
            └── bin/
                ├── banners.tar.gz
                ├── bin.SlackBuild
                ├── debianutils_2.7.dsc
                ├── debianutils_2.7.tar.gz
                ├── fbset-2.1.tar.gz
                ├── scripts/
                │   ├── diskcopy.gz
                │   └── xx.gz
                ├── slack-desc
                └── todos.tar.gz
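
The --cut-dirs value is just the number of leading remote path components to strip: the URL above has eight (pub through bin), so --cut-dirs=4 removes the first four and keeps the rest. To drop the remote path entirely and download straight into the current directory, a variation of the same command:

$ wget -m -nH --cut-dirs=8 -np --reject 'index.*' \
 http://ftp.lip6.fr/pub/linux/distributions/slackware/slackware64-current/source/a/bin/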

Alternative

This is not what you asked, but I personally like to use lftp for that:

lftp -c "open http://example.com/folder1/; mirror folder2"
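
mirror also accepts a local target directory and file filters, so a variant tailored to the question might look like this (a sketch: -I/--include-glob restricts which files are transferred, and /downloads is the directory from the question's prompt):

lftp -c "open http://example.com/folder1/; mirror -I '*.pdf' folder2 /downloads"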