Super User is a question and answer site for computer enthusiasts and power users.
I have a very unstable internet connection, and sometimes have to download files as large as 200 MB.

The problem is that the speed frequently drops to --, -K/s while the process stays alive. I thought of sending KILL signals to the process, but according to the section on signals in the wget manual, that doesn't help.

How can I force wget to reinitialize itself and pick the download up where it left off after the connection drops and comes back up again?

I would like to leave wget running, and when I come back, I want to see it downloading, and not waiting with speed --,-K/s.


To avoid the --, -K/s stall you can use --read-timeout=seconds, which times the connection out after that many seconds without any data being received.

If you need to go beyond that you can use this setup

wget --retry-connrefused --waitretry=1 --read-timeout=20 --timeout=15 -t 0

This will:

  - retry refused connections and similar fatal errors (--retry-connrefused),
  - wait 1 second before the next retry attempt (--waitretry=1),
  - wait a maximum of 20 seconds if no data is received before trying again (--read-timeout=20),
  - wait at most 15 seconds for the initial connection to complete (--timeout=15),
  - and retry an infinite number of times (-t 0).

You might also want to put this in a while loop to recover from local network failures and the like. In that case you also need to add --continue so the download resumes where it left off. The following works well in Bash:

while true; do
    # $URL is a placeholder for the address of the file you are downloading
    wget --retry-connrefused --waitretry=1 --read-timeout=20 --timeout=15 -t 0 --continue "$URL"
    if [ $? -eq 0 ]; then break; fi   # break out of the loop once wget exits successfully
    sleep 1
done
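The retry-until-success pattern in that loop works with any command in place of wget. As a minimal sketch (the flaky_download function here is a hypothetical stand-in that fails twice before succeeding, simulating an unstable connection):

```shell
#!/bin/sh
# flaky_download simulates an unreliable transfer:
# it fails (non-zero exit) on the first two attempts, then succeeds.
attempts=0
flaky_download() {
    attempts=$((attempts + 1))
    [ "$attempts" -ge 3 ]   # exit status 0 only from the third attempt on
}

# Same shape as the wget loop: keep retrying until the command exits 0.
until flaky_download; do
    sleep 0.1   # brief pause before retrying, like the sleep 1s above
done
echo "succeeded after $attempts attempts"
```

Running this prints "succeeded after 3 attempts"; with the real wget command, exit status 0 likewise means the file finished downloading.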

As a bonus tip, you can also use --no-dns-cache in case the host balances requests across multiple servers via DNS.

Disclaimer: I do not recommend using this, since it will spam the host if the connection is unstable, and it's unwise to leave it unmonitored. However, this is what you want if you really need to download something and your connection doesn't work adequately.


--tries=number

This option sets the number of retries to number. Specify 0 or ‘inf’ for infinite retrying.

wget --tries=70 http://example.com/myfile.zip should do it.

The default is to retry 20 times, with the exception of fatal errors like “connection refused” or “not found” (404), which are not retried.


Would this help? On askubuntu.com I found a question very similar to the one you are asking: http://askubuntu.com/questions/72663/how-to-make-wget-retry-download-if-speed-goes-below-certain-threshold

