When there is a need to download files such as pictures, PDFs, or any other type of file, it can be done by right-clicking the link and saving the file, but that quickly becomes tedious.

A better approach is to use wget.

It is usually preinstalled on Linux distributions, but if not, we can install it with the package manager.

Debian / Ubuntu

apt-get install wget

Fedora / CentOS

yum install wget (or dnf install wget on newer Fedora releases)

Sabayon

emerge wget
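
To confirm that wget is available after installation, a quick version check works on any of these distributions:

```shell
# Print the first line of wget's version banner.
wget --version | head -n 1
```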

Now let's explore some of its features.

To download a single file:

wget http://www.site.com/file.pdf

But if we need to download an entire site, we use the recursive option:

wget -r http://www.site.com

Now, what if only certain file types are needed? We use the -A (accept list) option.

To download only pdf and jpg files:

wget -r -A pdf,jpg http://www.site.com
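The accept-list behaviour can be seen locally without touching a real site. The sketch below (assuming python3 is available to serve a throwaway directory) creates one pdf and one txt file, serves them, and mirrors with -A pdf; wget downloads the index page to discover links but removes it afterwards, since it does not match the accept list:

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
mkdir site
echo pdf-data > site/doc.pdf
echo txt-data > site/notes.txt
# Index page linking to both files, so wget -r can discover them.
cat > site/index.html <<'EOF'
<html><body>
<a href="doc.pdf">doc</a>
<a href="notes.txt">notes</a>
</body></html>
EOF
cd site
python3 -m http.server 8718 >/dev/null 2>&1 &
srv=$!
cd ..
sleep 1
wget -q -r -A pdf http://localhost:8718/
kill $srv
# Only doc.pdf should remain under localhost:8718/;
# notes.txt and index.html are rejected by -A pdf.
ls localhost:8718
```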

Now suppose we also need to follow external links. By default, wget does not do this, so we use the -H option:

wget -r -H -A pdf,jpg http://www.site.com

This is a little dangerous, as it could end up downloading many more files than needed, so we can limit the domains wget is allowed to follow with the -D option.

wget -r -H -A pdf,jpg -D files.site.com http://www.site.com

By default, wget follows links up to 5 levels deep when using the -r option; we can change this behaviour with the -l option.

wget -r -l 2 http://www.site.com

This way, only two levels of depth will be followed.
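Depth limiting can also be demonstrated locally. The sketch below (again assuming python3 for a throwaway server) builds a three-page chain, index -> page1 -> page2, and mirrors it with -l 1; wget fetches the start page and the pages it links to, but does not go a level further to page2:

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
mkdir site
# A chain of pages: index links to page1, page1 links to page2.
echo '<a href="page1.html">1</a>' > site/index.html
echo '<a href="page2.html">2</a>' > site/page1.html
echo 'deep page' > site/page2.html
cd site
python3 -m http.server 8719 >/dev/null 2>&1 &
srv=$!
cd ..
sleep 1
wget -q -r -l 1 http://localhost:8719/
kill $srv
# index.html and page1.html should be present; page2.html is
# two levels from the start URL, so -l 1 stops before it.
ls localhost:8719
```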