
Moving Files In Linux - page 5

The Low Security Family

  • May 22, 2003
  • By Dee-Ann LeBlanc

GNU's wget utility is a non-interactive download tool: unlike lftp, it offers no interactive command shell. It can retrieve files over FTP, HTTP, and HTTPS, including through an HTTP proxy, but you have to know ahead of time which file you want to download and where it lives on the server.

This command expects the file's location and path in URL format. Say that I want to write a script that pulls data out of a Web page. I can easily grab that page using wget and then work with its source. For example, I can download the default page of the Canadian Broadcasting Corporation (CBC) site with:

wget http://www.cbc.ca/

If there is a list of pages, files, and so on that I want to grab for the script, I can list one URL-formatted item per line within a file. For example, if that file were ~/bin/data/getme, I would use:

wget -i ~/bin/data/getme
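The input file itself is plain text, one URL per line, with nothing else needed. A hypothetical ~/bin/data/getme might look like this (the URLs beyond the CBC front page are made-up examples):

```shell
# ~/bin/data/getme -- one URL per line; wget fetches each in turn.
# Lines may use any scheme wget supports (http, https, ftp).
http://www.cbc.ca/
http://www.cbc.ca/news/
ftp://ftp.gnu.org/gnu/wget/
```

wget downloads the listed items in order, saving each under its remote file name in the current directory.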

I could even tell wget to grab all of the URLs listed in a particular HTML file. If the default page I downloaded from the CBC was saved as index.html.1 in my home directory, then I would use the following to have wget grab every URL referenced in that file:

wget -i index.html.1 -F

Notice that -F is a separate flag, not part of -i, so the two cannot be fused together. This command will not work if you use -iF, because wget would read that as -i with the argument F, i.e., an attempt to read URLs from a file named F.
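One wrinkle worth knowing: a saved page like index.html.1 usually refers to other resources with relative links, which wget cannot resolve on their own. GNU wget's -B (--base) flag supplies the missing URL prefix when parsing links out of a local HTML file; a sketch (verify the flags against your version's manual):

```shell
# Force HTML parsing of the local file (-F) and resolve its
# relative links against the CBC site (-B) before downloading them.
wget -F -B http://www.cbc.ca/ -i index.html.1
```

Without -B, relative references such as /news/ in the saved page would be skipped or misinterpreted rather than fetched from www.cbc.ca.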

wget has a number of useful features, including separate sets of options for FTP and HTTP connections. Taking the time to get more familiar with this tool is well worth your efforts.
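As a taste of those protocol-specific options, here are a few illustrative invocations; the flag names are taken from GNU wget's documentation, but the hosts, users, and paths are made-up examples:

```shell
# HTTP: send basic-authentication credentials with the request
wget --http-user=alice --http-password=secret http://example.com/private/report.html

# FTP: log in as a named user instead of the default anonymous login
wget --ftp-user=alice --ftp-password=secret ftp://example.com/pub/data.tar.gz

# Either protocol: retry a flaky transfer, pausing between attempts
wget --tries=3 --wait=5 http://example.com/big-file.iso
```

Older wget releases spelled the password options --http-passwd and --ftp-passwd, so check `man wget` on your system for the exact names.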
