How to download all files linked on a website using wget

wget

I use the following command to get all PDFs from a website:

wget --no-directories --content-disposition --restrict-file-names=nocontrol \
    -e robots=off -A.pdf -r url

However, this only downloads .pdf files. How can I extend this command to also download .ppt and .doc files?

Best Answer

wget's -A option takes a comma-separated accept list, not just a single suffix, so you can name all three extensions at once:

wget --no-directories --content-disposition --restrict-file-names=nocontrol \
    -e robots=off -A.pdf,.ppt,.doc -r url
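
The -A list also accepts shell-style wildcard patterns, which is handy for newer Office formats whose extensions end in "x". A sketch (url is a placeholder for the actual site):

```shell
# Sketch: -A entries may be glob patterns, so '*.ppt*' matches both
# .ppt and .pptx, and '*.doc*' matches .doc and .docx.
# Quote the list so the shell does not expand the wildcards itself.
# "url" is a placeholder, not a real address.
wget --no-directories --content-disposition --restrict-file-names=nocontrol \
    -e robots=off -A '*.pdf,*.ppt*,*.doc*' -r url
```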

See man wget and search for -A for more details.
