Save a single web page (with background images) with Wget

command-line, download, mirroring, web, wget

I want to use Wget to save single web pages (not recursively, not whole sites) for reference. Much like Firefox's "Web Page, complete".

My first problem is: I can't get Wget to save background images specified in the CSS. Even if it did save the background image files, I don't think --convert-links would rewrite the background-image URLs in the CSS file to point to the locally saved copies.
Firefox has the same problem.

My second problem is: if there are images on the page that are hosted on another server (like ads), these won't be included. --span-hosts doesn't seem to solve that problem with the line below.

I'm using:
wget --no-parent --timestamping --convert-links --page-requisites --no-directories --no-host-directories -e robots=off http://domain.tld/webpage.html
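
For reference, a --span-hosts variant would look something like the line below; --domains restricts which extra hosts are fetched, and ads.example.com is just a placeholder for wherever the off-site images live:

wget --span-hosts --domains=domain.tld,ads.example.com --page-requisites --convert-links --no-directories --no-host-directories -e robots=off http://domain.tld/webpage.html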

Best Answer

From the Wget man page:

Actually, to download a single page and all its requisites (even if they exist on separate websites), and make sure the lot displays properly locally, this author likes to use a few options in addition to ‘-p’:

wget -E -H -k -K -p http://www.example.com/

Also, in case robots.txt is disallowing access, add -e robots=off
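
For example, combining those options with the ones from the question (same placeholder URL as above) might look like:

wget -E -H -k -K -p --no-directories --no-host-directories -e robots=off http://domain.tld/webpage.html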
