How to use wget --wait option when using input from a file

wget

I have a text file with a list of URLs I want to download, and I want wget to wait before downloading the next URL. I'm using

cat urls.txt | xargs -n1 -i wget --wait=30 {}

However, there is no wait between URLs. Can I use wget's --wait option for this? Are there other alternatives?

Best Answer

That's because xargs invokes a new wget process for each URL, so --wait never applies between two downloads that happen in separate invocations. Use the -i option to feed the whole list of URLs to a single wget process:

$ wget -i urls.txt --wait=30

From the manual:

-i file
--input-file=file

Read URLs from a local or external file. If - is specified as file,
URLs are read from the standard input. (Use ./- to read from a file
literally named -.)  If this function is used, no URLs need be present
on the command line. If there are URLs both on the command line and in
an input file, those on the command lines will be the first ones to be
retrieved. If --force-html is not specified, then file should consist
of a series of URLs, one per line.

However, if you specify --force-html, the document will be regarded as
html. In that case you may have problems with relative links, which
you can solve either by adding "<base href="url">" to the documents or
by specifying --base=url on the command line.

If the file is an external one, the document will be automatically
treated as html if the Content-Type matches text/html. Furthermore,
the file's location will be implicitly used as base href if none was
specified.
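As the quoted manual notes, passing - as the file name makes wget read URLs from standard input, so you can keep a pipeline if you prefer. A plain shell loop with sleep is another alternative. A minimal sketch, assuming urls.txt contains one URL per line:

# Single wget process reads URLs from stdin; --wait applies between retrievals
cat urls.txt | wget --wait=30 -i -

# Alternative: loop over the file and sleep between downloads
while IFS= read -r url; do
    wget "$url"
    sleep 30
done < urls.txt

The single-process forms (-i urls.txt or -i -) are generally preferable, since wget can also reuse connections and honor --wait consistently across the whole list.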