I would like to create a short but sweet script for wget to use a .list file. The catch is that I'd like to set directories that these files go into.
Example:
file: url.list
[group 1]
http://www.somehost.com/files/tool.7z
http://www.someotherhost.com/files/icon36.png
[group 2]
http://www.idunno.net/other-tool.tar.gz
http://265.265.265.265/config.ini
http://www.myownsite.com/tools/script-to-run-tool.cmd
eof
([group 1] and [group 2] are just here for readability, they are NOT in my real list file)
(yeah I know 265 isn't real, that's why it's an example)
Command I'm currently using (which cannot assign folders):
wget --continue --timestamping --content-disposition -i url.list
Of course, this currently downloads all 5 files to the same directory.
My question is: is there a way to tell wget to use a different folder for group 1 and for group 2? In my case I'd like this to grab several tools that I use at work; I have a separate script in Windows that creates a WinPE USB key and injects all the tools in these directories onto the key.
So my ultimate question: can this be done easily, or does it require a full bash script to grab the files, create the folders, and move them into place? (Using -O in wget breaks my timestamping, and timestamps are mission critical.)
In theory, when this finishes I would like to have a fresh copy of (pseudo names):
tools/cool-tool/tool.7z
tools/cool-tool/icon36.png
tools/special-tool/other-tool.tar.gz
tools/special-tool/config.ini
tools/special-tool/script-to-run-tool.cmd
Best Answer
Create a list that has the URL and target directory on the same line:
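For example, reusing the URLs and target folders from the question (target directory first, then the URL, separated by whitespace; the exact column order is just a convention for the loop below):

```
tools/cool-tool http://www.somehost.com/files/tool.7z
tools/cool-tool http://www.someotherhost.com/files/icon36.png
tools/special-tool http://www.idunno.net/other-tool.tar.gz
tools/special-tool http://265.265.265.265/config.ini
tools/special-tool http://www.myownsite.com/tools/script-to-run-tool.cmd
```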
Then use a bash loop to read the file and feed each entry to wget:
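A minimal sketch, assuming the list format above ("directory URL" per line, file named url.list). It uses wget's -P/--directory-prefix to set the download folder, which, unlike -O, leaves the file name alone and so keeps --timestamping working:

```shell
#!/usr/bin/env bash
# Read "directory URL" pairs from a list file and download each URL
# into its directory. -P/--directory-prefix only changes where the
# file is saved, so --continue and --timestamping behave as usual.
fetch_list() {
  local list="$1"
  local dir url
  while read -r dir url; do
    [ -z "$dir" ] && continue         # skip blank lines
    mkdir -p "$dir"                   # create the target folder if missing
    wget --continue --timestamping --content-disposition \
         --directory-prefix="$dir" "$url"
  done < "$list"
}
```

Run it with `fetch_list url.list`; the resulting tree matches the tools/cool-tool and tools/special-tool layout from the question, with no post-download moving needed.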