I want to download a website from a URL so I can view it locally. More exactly:
- Download one single html page (no other linked html pages) and everything needed to display it (css, images, etc.)
- Also download all directly linked files of type pdf and zip.
- And correct all links to them, so the links work locally.
- The other links (for example to html files) should be kept untouched.
I'm open to all Linux-based tools (MacPorts support would be nice); using wget didn't work out for me so far.
Edit: wget -E -H -k -K -p is close to what I want, but how do I include pdf and zip files?
(Source: Stack Overflow)
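One way to fold the PDF and ZIP files into that wget approach is a second, shallow recursive pass with an accept list, since -A only takes effect in recursive mode. This is only a sketch under the assumption that the files are linked directly from the page and sit on the same host (the URL is a placeholder); note that wget will still not rewrite the page's links to point at those extra files, which is the remaining gap:

    # First pass: the page itself plus everything needed to render it,
    # with links converted for local viewing (the command from the edit).
    wget -E -H -k -K -p http://example.com/page.html

    # Second pass: follow links one level deep, but keep only pdf/zip.
    # -r -l 1  recurse a single level from the page
    # -A       accept list by file suffix (the HTML page is fetched for
    #          parsing and then removed because it does not match)
    # -nd      do not recreate the server's directory structure locally
    wget -r -l 1 -A pdf,zip -nd http://example.com/page.html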
Best Answer
HTTrack (homepage) can mirror sites for offline viewing, with rather fine-grained options as to what to download and what not. It can also resume interrupted downloads.
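A minimal command-line sketch, with the URL and output directory as placeholders; -O, -rN and the +pattern filters are standard HTTrack options, but the exact depth and filter set may need tuning for a given site (for example, extra exclusion filters to keep other HTML pages out):

    # Mirror a single page, two levels deep, and explicitly allow
    # PDF and ZIP files linked from it.
    # -O        output directory for the mirror
    # -r2       limit the mirror depth (the page plus what it links to)
    # +*.pdf    filter rules that whitelist these file types
    # +*.zip
    httrack "http://example.com/page.html" -O ./mirror -r2 "+*.pdf" "+*.zip"

HTTrack rewrites links to whatever it actually downloaded so they work locally, and leaves links it did not follow pointing at the original site, which should match the requirement of keeping the other links untouched.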