You can use
curl -L -O --retry 999 --retry-max-time 0 -C - http://url
-C - : resume where the previous download left off
--retry 999 : retry up to 999 times
--retry-max-time 0 : prevent it from timing out on retrying
or
curl -L -o 'filename' -C - http://url
Update
export ec=18; while [ $ec -eq 18 ]; do /usr/bin/curl -O -C - "http://www.example.com/big-archive.zip"; export ec=$?; done
Explanation:
The exit code curl returns when a download is interrupted is 18, and $? gives you the exit code of the last command in bash. So, while the exit code is 18, keep trying to download the file, maintaining the filename (-O).
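The exit-code loop can be exercised without touching the network. The sketch below replaces curl with a stand-in function (fake_curl, made up for illustration) that returns 18 twice before succeeding, mimicking an interrupted download that completes on the third attempt:

```shell
# Stand-in for curl: returns 18 ("partial file") twice, then 0.
attempts=0
fake_curl() {
    attempts=$((attempts + 1))
    [ "$attempts" -lt 3 ] && return 18   # pretend the transfer was cut off
    return 0                             # third attempt completes
}

# Same loop shape as the answer above: retry while the exit code is 18.
ec=18
while [ "$ec" -eq 18 ]; do
    fake_curl
    ec=$?
done
echo "finished after $attempts attempts"
```

With the real curl command in place of fake_curl, the loop keeps re-invoking the download, and -C - picks up from the bytes already written.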
My personal preference would be to use wget
which has been built specifically for this use case. From the man page:
Wget has been designed for robustness over slow or unstable network connections;
if a download fails due to a network problem, it will keep retrying until the
whole file has been retrieved. If the server supports regetting, it will
instruct the server to continue the download from where it left off.
wget
is available for almost all Linux distributions - it is probably already installed on yours. Just use wget
to download the file; it will re-establish the network connection until the file is completely transferred.
I finally found a solution, and there's no extension necessary in Firefox.
After getting the URL you received, open a new tab in Firefox and open the Developer Tools (F12). Switch to the Network tab and enter the URL for the download. Hit cancel when it prompts you to download.
There will be a GET request now under Network; right-click it and select Copy > Copy as cURL
. Paste this massive string into your terminal and add -o [/path/filename]
to actually save the file somewhere instead of dumping it to STDOUT. That's it!
In a Chromium-based browser, on the Microsoft download page, after it has created the "32-bit Download" and "64-bit Download" buttons: open the Developer Tools and switch to the Network tab (it should be empty), then click the 32- or 64-bit download button. It should bring up a save dialog; cancel it. Right-click the new URL listed in the Network tab and select Copy -> Copy as cURL.
Then paste it into the terminal window and append the -o [/path/filename]
value (e.g. -o Win10_20H2_v2_English.iso
), and it should start downloading.
Best Answer
Yes, both wget and curl support limiting your download rate; the option is described directly in each tool's man page.
curl
E.g.:
curl --limit-rate 423K http://url
wget
E.g.:
wget --limit-rate=423k http://url
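Either rate cap combines with the resume flags from the earlier answers. The sketch below just builds and prints the full commands so it stays network-free (http://url is a placeholder for the real download link):

```shell
url="http://url"   # placeholder

# curl: cap transfer at ~423 KB/s, resume with -C -, keep the filename with -O
curl_cmd="curl --limit-rate 423K -C - -O $url"

# wget: same cap; -c resumes a partial download
wget_cmd="wget --limit-rate=423k -c $url"

echo "$curl_cmd"
echo "$wget_cmd"
```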