Lower Wait Time for Repository Updates – How to Guide


When doing an aptitude update / apt-get update, or updating through the Update Manager, I sometimes hit a repository link that takes too long. The percentage does not move, and it takes quite a while before the repository is ignored.

How can I lower the timeout so that if a particular repository takes more than 10 seconds to connect or finish, it is ignored and apt moves on to the following ones? Here is an image explaining the problem:

[Screenshot: apt-get update stuck on "Connecting to archive.ubuntu.com"]

It is trying to connect to archive.ubuntu.com, but since that is taking too long it just sits there for at least 3 to 5 minutes (I haven't measured the time exactly), and only then does it show as ignored and move on to the next one. I want to change that wait from minutes to seconds.

Best Answer

How can I lower the timeout so that if a particular repository takes more than 10 seconds to connect or finish, it is ignored and apt moves on to the following ones?

Mirrors are one option, as @adempewolff explained. Let me give you a direct answer though:

Setting apt-get connection timeouts

You can control these timeouts via the following apt.conf options:

  Acquire::http::Timeout "10";
  Acquire::ftp::Timeout "10";
  

Note that this only applies to connection timeouts, NOT to "time to finish" timeouts; i.e. if it connects within 10 seconds, it will happily keep downloading a 100 MB package even at 1 KB/second :)
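If you just want to try a value before committing it to a file, you can pass the same options on the command line with -o (a quick sketch; the 10-second value is only an example):

  # test the timeouts for a single run, without touching any config files
  sudo apt-get -o Acquire::http::Timeout=10 -o Acquire::ftp::Timeout=10 update

Once you are happy with the value, make it permanent as follows.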

To implement these options, simply create a conf file in /etc/apt/apt.conf.d; suppose we call it 99timeout. (A terminal-only way to do the same is sketched after the steps below.)

  • Press Alt+F2, type gksudo gedit /etc/apt/apt.conf.d/99timeout
  • Type/paste the above lines, with your choice of timeout in seconds
  • Save and exit.
  • Now try sudo apt-get update
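
If you prefer to stay in the terminal instead of using gedit, the same file can be created with tee and checked with apt-config (a sketch; 99timeout and the 10-second value are just the names used above):

  # create /etc/apt/apt.conf.d/99timeout with both timeout options
  printf 'Acquire::http::Timeout "10";\nAcquire::ftp::Timeout "10";\n' | sudo tee /etc/apt/apt.conf.d/99timeout

  # confirm that apt picked the options up
  apt-config dump | grep -i timeout

  # then run the update as usual
  sudo apt-get update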

And the terminal-addict's "find best server" hack!

Expanded and moved as an answer to this more appropriate question


Additional apt-get conf options that you can try to tweak

  • Acquire::Queue-Mode: queuing mode; it can be one of host or access, and determines how APT parallelizes outgoing connections. host opens one connection per target host, access opens one connection per URI type.

  • Acquire::Retries: Number of retries to perform. If this is non-zero APT will retry failed files the given number of times.

  • Acquire::http::Dl-Limit: accepts an integer value in kilobytes, used to throttle the download speed so that updating does not slow down your browsing/email/etc. The default value is 0, which disables the limit and uses as much bandwidth as possible. If enabled, it disables apt-get's parallel downloading feature.

  • Dig through man apt.conf if you think something else might help! (A sample file combining the options above follows this list.)
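
As a sketch only, a combined /etc/apt/apt.conf.d/99timeout could look like the following; the retry count and the 500 KB limit are illustrative values, not recommendations:

  // /etc/apt/apt.conf.d/99timeout -- example values only
  Acquire::http::Timeout "10";
  Acquire::ftp::Timeout "10";
  Acquire::Retries "3";
  Acquire::Queue-Mode "host";
  Acquire::http::Dl-Limit "500";

Remember that Dl-Limit disables apt-get's parallel downloading, so only set it if you actually want the throttling.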