From assela Pathirana


There is a handy little utility called wget for downloading files directly from the command line. For example, to download the Google logo that appears on the Google search page (http://www.google.com/intl/en/images/logo.gif), run:

$ wget http://www.google.com/intl/en/images/logo.gif

and you will see some progress messages like the ones below

--23:08:07--  http://www.google.com/intl/en/images/logo.gif
           => `logo.gif'
Resolving www.google.com...,
Connecting to www.google.com||:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 8,558 (8.4K) [image/gif]

100%[====================================>] 8,558         --.--K/s             

23:08:08 (148.57 KB/s) - `logo.gif' saved [8558/8558]

The file is saved in the current directory. This is particularly useful when you want a program to download files automatically.
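The automated-download idea can be sketched as a small script that reads URLs from a list and builds one wget command per line. Everything here is illustrative: urls.txt and commands.txt are hypothetical filenames, and the script echoes the commands instead of executing them, so it can be inspected (or piped to sh) before any download happens.

```shell
# Hypothetical batch-download helper.
# urls.txt holds one URL per line; for each, build a wget command that
# saves the file under its base name, and collect the commands in
# commands.txt for review.
printf '%s\n' "http://www.google.com/intl/en/images/logo.gif" > urls.txt

while read -r url; do
  echo "wget --output-document=$(basename "$url") $url"
done < urls.txt > commands.txt

cat commands.txt
```

To actually run the downloads, replace the final `cat` with `sh commands.txt`.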

Advanced stuff

The wget man page gives more information on how to use it creatively. For example, to download the Google image and save it as google.gif instead of logo.gif:

wget http://www.google.com/intl/en/images/logo.gif --output-document=google.gif

Or, to download a whole site:

wget --wait=20 --limit-rate=20K -r -p -U Mozilla http://sitename.com

(For large sites, this can take a long time and may fill up a decent chunk of your hard drive. But it will work.)
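Those flags pack a lot in, so here is the same command with each option spelled out, plus two standard wget options (--no-parent and --convert-links) that are useful additions when mirroring; sitename.com is a placeholder as in the original command, so this is a template rather than something to run verbatim.

```shell
# --wait=20         pause 20 seconds between requests (polite to the server)
# --limit-rate=20K  cap download speed at 20 kilobytes per second
# -r                recursive: follow links within the site
# -p                page requisites: also fetch the images/CSS a page needs
# -U Mozilla        send a browser-like User-Agent string
# --no-parent       never ascend above the starting directory
# --convert-links   rewrite links so the local copy browses offline
wget --wait=20 --limit-rate=20K -r -p -U Mozilla \
     --no-parent --convert-links http://sitename.com
```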