I have a folder with directory listing enabled on the web (http://example.com/folder1/folder2/)
/folder2 contains multiple subfolders with PDF files in them. I want to download all the content of /folder2, including all subfolders and files, to my server over SSH using wget. I've tried the following, but I keep getting only an index.html and a robots.txt file.
[root@myserver downloads]# wget -r --no-parent --reject "index.html*" http://www.example.com/folder1/folder2/
--2015-08-07 07:46:36-- http://www.example.com/folder1/folder2/
Resolving www.example.com... 192.168.1.1
Connecting to www.example.com|192.168.1.1|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: `www.example.com/folder1/folder2/index.html'
[ <=> ] 4,874,325 138K/s in 37s
2015-08-07 07:47:42 (128 KB/s) - `www.example.com/folder1/folder2/index.html' saved [4874325]
Loading robots.txt; please ignore errors.
--2015-08-07 07:47:42-- http://www.example.com/robots.txt
Connecting to www.example.com|192.168.1.1|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 26 [text/plain]
Saving to: `www.example.com/robots.txt'
100%[======================================>] 26 --.-K/s in 0s
2015-08-07 07:47:42 (1.42 MB/s) - `www.example.com/robots.txt' saved [26/26]
Removing www.example.com/folder1/folder2/index.html since it should be rejected.
FINISHED --2015-08-07 07:47:42--
Downloaded: 2 files, 4.6M in 37s (128 KB/s)
[root@myserver downloads]#
Other commands that I've tried, with similarly failed results:
wget -m -p -E -k -K -np http://example.com/folder1/folder2/
wget -r http://example.com/folder1/folder2/ -nd -P /downloads -A PDF
Best Answer
I suppose you want to download via wget, and SSH is not the issue here. Solution by Attilio:

Edit

The above solution is well suited for mirroring websites in general; sorry, I was a bit too quick to answer, and it is not optimal for mirroring only PDFs.
-m, --mirror: download everything recursively
-nH, --no-host-directories: do not put the data inside a directory named after the host
--cut-dirs=1: skip the first directory when creating the local hierarchy
-np, --no-parent: do not fetch the parent directories
-R, --reject 'index.*': do not save files named like "index.*"

Might be useful: -e robots=off to tell wget to ignore your robots.txt.

Example
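Putting the options above together against the URL from the question, the command might look like the sketch below. The -A '*.pdf' accept filter is an addition (not in the option list above) to address the edit note about mirroring only PDFs:

```shell
# Mirror everything under /folder1/folder2/ into the current directory:
#   -m                recursive mirror
#   -nH               no directory named after the host
#   --cut-dirs=1      drop the leading "folder1/" component locally
#   -np               never ascend to the parent directory
#   -e robots=off     ignore robots.txt
#   -R 'index.*'      discard directory index pages after crawling them
#   -A '*.pdf'        keep only PDF files (assumption, added for this question)
wget -m -nH --cut-dirs=1 -np -e robots=off \
     -R 'index.*' -A '*.pdf' \
     http://www.example.com/folder1/folder2/
```

Note that wget still has to fetch the index.html pages to discover links; -R and -A only control which files are kept on disk afterwards.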
Alternative

This is not what you asked, but I personally like to use lftp for that:
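The original lftp command did not survive in this copy of the answer; a sketch of what it might look like, with the host, remote path, and local target directory assumed from the question, is:

```shell
# Open the site over HTTP and mirror only PDFs from folder2 into /downloads.
# -c runs the given commands and exits; --include-glob limits what is fetched.
lftp -c "open http://www.example.com/; \
         mirror --include-glob '*.pdf' /folder1/folder2/ /downloads"
```

lftp's mirror command parses HTTP directory listings itself, which often makes it more predictable than wget's recursive mode for this kind of bulk download.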