wget is a command-line tool used to download files or even complete web pages. It is a great utility with lots of options, as you can see if you read the <a href="http://www.go2linux.org/wget-man-page-usage" target="_blank">wget man page</a>.

Some months ago, I wrote about how to download files with wget; now I want to add some other tips to those already explained that day.

Resume a download

If you need to stop a download in progress and intend to resume it later, use the -c option, i.e.:

wget http://some.server.com/file -c

Traffic shaping, or limiting the speed of the download

I really use this feature a lot. Since my home ADSL is not as fast as I would like, I have to limit the speed when downloading ISOs; otherwise I just cannot keep working. To limit the download speed, use the --limit-rate option.

wget http://some.server.com/file --limit-rate=20k

That line limits the download speed to 20 kilobytes per second, which is 160 kilobits per second.
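As a quick sanity check on those units (the 20 here is just the example's value, not anything wget requires), a kilobyte per second is eight kilobits per second:

```shell
# Rate from the example above, in kilobytes per second
rate_kbytes=20
# 1 byte = 8 bits, so multiply by 8 to get kilobits per second
echo $(( rate_kbytes * 8 ))
```

The same suffix logic applies if you pass --limit-rate=1m (one megabyte per second) instead of a k value.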

Keep wget working after logging out of an ssh session

I usually connect through ssh to my office (better ADSL than at home) and download the files there overnight; the next day I bring them home.

To make wget keep working after you log out (I do not want to leave my home PC on all night long), run it in the background with the -b option:

wget -b http://some.server.com/file
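With -b and no explicit log file, wget detaches from the terminal, prints the process ID, and appends its output to a file named wget-log in the current directory, so you can check on the download later (the URL below is the same placeholder used above):

```shell
# Start the download in the background; wget prints the PID
# and tells you where the output is going
wget -b http://some.server.com/file

# With no -o option the output lands in wget-log; follow it live
tail -f wget-log
```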

Logging the output to a file

This is useful when you are running wget in the background: to be able to see what happened if anything goes wrong, use the -o option and specify a file to store the log.

wget http://some.server.com/file -o $HOME/log.txt

Of course you can combine these options and put together something like this:

wget -b -c http://some.server.com/file --limit-rate=20k -o $HOME/log.txt