Wget – How to Create a Local Copy of a Website Section on OSX

mirror, recursive, wget

This question follows from: How do I create a local copy of a complete website section from OSX using curl?

After discovering that OSX's native curl wouldn't do this task, I downloaded wget from here: http://www.techtach.org/wget-prebuilt-binary-for-mac-osx-lion

But performing:

./wget -r -l 0 https://ccrma.stanford.edu/~jos/mdft/

takes hours and downloads a ton of other material I didn't want that ISN'T contained in this folder:

http://cl.ly/ENKr

Moreover, when I open a particular page, many of the images are missing:

http://cl.ly/ELXG

This may be because I aborted the transfer after a few hours(!)

How do I do this properly?

Best Answer

Try adding:

--no-parent

"Do not ever ascend to the parent directory when retrieving recursively. This is a useful option, since it guarantees that only the files below a certain hierarchy will be downloaded."

In my experience it also prevents downloading from other sites.
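A full invocation might look like the sketch below. Only `--no-parent` comes from the answer above; the other flags (`--page-requisites` to fetch the images and stylesheets each page needs, `--convert-links` to rewrite links for offline browsing) are standard wget options suggested here as a likely fix for the missing-image problem, not part of the original answer:

```shell
# Mirror only the ~jos/mdft/ hierarchy. --no-parent keeps wget from
# ascending above mdft/ (and so from pulling in the rest of the site);
# --page-requisites also fetches images, CSS, etc. referenced by each
# page; --convert-links rewrites links so the local copy works offline.
wget --recursive --level=inf --no-parent \
     --page-requisites --convert-links \
     https://ccrma.stanford.edu/~jos/mdft/
```

Note that `--page-requisites` will still fetch individual image or stylesheet files from outside the `mdft/` directory if a page references them there, which is usually what you want for a browsable copy.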
