How to Download Linked Images from a Website Using Wget

Tags: downloads, wget

Is it possible to download all .jpg and .png files linked on a web page? I want to download the images from each post, in each thread of [this forum][1], that contains a link. For example, [this post][2] contains a link to [this file][3].

I've tried with wget:

  wget -r -np http://www.mtgsalvation.com/forums/creativity/artwork/340782-official-digital-rendering-thread? 

and it copied all the HTML files of that thread, though I don't know why it jumps from ...thread?comment=336 to ...thread?comment=3232 after going one by one up to comment 336.

Best Answer

Try with this command:

wget -P path/where/save/result -A jpg,png -r http://www.mtgsalvation.com/forums/creativity/artwork/

According to the wget man page:

    -A acclist --accept acclist
        Specify comma-separated lists of file name suffixes or patterns to
        accept or reject (@pxref{Types of Files} for more details).
    -P prefix
        Set directory prefix to prefix.  The directory prefix is the direc‐
        tory where all other files and subdirectories will be saved to,
        i.e. the top of the retrieval tree.  The default is . (the current
        directory).
    -r
    --recursive
        Turn on recursive retrieving.

Try this:

    mkdir wgetDir
    wget -P wgetDir http://www.mtgsalvation.com/forums/creativity/artwork/340782-official-digital-rendering-thread?page=145

This command fetches the HTML page and puts it in wgetDir. When I ran it, I got this file:

    340782-official-digital-rendering-thread?page=145

Then I tried this command:

    wget -P wgetDir -A png,jpg,jpeg,gif -nd --force-html -r -i "wgetDir/340782-official-digital-rendering-thread?page=145"

and it downloaded images. So it seems to work, although I do not know whether these are the pictures you want.
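The two commands above handle a single page, but the thread spans many pages, so you can repeat them in a loop. Here is a minimal sketch; the page range (1 to 3) and the `wgetDir` directory are assumptions, and it only prints the commands it would run, so you can inspect them and pipe the output to `sh` once they look right:

```shell
#!/bin/sh
# Build the two-step wget commands for each page of the thread.
# Step 1 fetches the page's HTML; step 2 re-reads that saved file
# (-i with --force-html) and recursively grabs only the image types
# listed in -A, flattening directories with -nd.
base="http://www.mtgsalvation.com/forums/creativity/artwork/340782-official-digital-rendering-thread"
for page in 1 2 3; do
  file="wgetDir/340782-official-digital-rendering-thread?page=$page"
  echo "wget -P wgetDir '$base?page=$page'"
  echo "wget -P wgetDir -A png,jpg,jpeg,gif -nd --force-html -r -i '$file'"
done
```

Run it as `sh build-commands.sh | sh` (after `mkdir wgetDir`) to actually execute the downloads.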