Most system administrators, and even some regular users, prefer the command line for everyday tasks such as editing and deleting files, creating and removing users, and finding IP addresses. One reason is that the command line is faster and uses fewer resources. Downloading files is another common task that can be done easily and quickly from the command line. Wget and curl are two command-line utilities that let you do exactly that.
In this post, we will describe how to use the wget and curl utilities to download files on Ubuntu from the command line.
Note: We will be describing the procedure on an Ubuntu 20.04 system.
Download Files Using Wget
Wget is a command-line tool for downloading files from the web. Using wget, you can download a single HTML file or an entire website. It supports downloads over the HTTP, HTTPS, and FTP protocols. It comes preinstalled on almost all Linux distributions. However, if it is not on your system or was accidentally removed, you can install it as follows:
$ sudo apt install wget
The basic syntax of the wget command is as follows:
$ wget [option]… [URL]…
Download a file from command line
To download a file from the command line, simply type wget followed by the URL of the file you want to download. For example, to download “vnstat-2.6.tar.gz”, a network traffic monitor package, from its website, the command would be:
$ wget https://humdi.net/vnstat/vnstat-2.6.tar.gz
Wget will start downloading the file and show you its progress. The file will be saved to the current working directory of your terminal.
Resume a partially downloaded file
If a download is interrupted for any reason, or you stopped it manually by pressing Ctrl+C, you can resume it with the wget -c option. This option continues a partially downloaded file from where it left off.
$ wget -c <URL>
Turn off verbose output
By default, wget displays verbose output showing all the details of the download process. You can reduce this output with the -nv (no-verbose) option.
$ wget -nv <URL>
With this option, wget prints only basic information about the download.
To completely turn off the verbose output, use the -q option:
$ wget -q <URL>
Download multiple files
To download multiple files, type wget followed by the URLs of all the files, separated by spaces.
$ wget <URL1> <URL2>
The wget command will download both files and save them to your current working directory.
Another way to download multiple files is with the wget -i option. Suppose you need to download a large number of files: create a text file listing all the URLs, one per line, then run wget with the -i option followed by the name of that file:
$ wget -i <filename>
Note: You can combine -i with the -nv option to reduce the verbose output.
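As a concrete sketch, the URL list file might look like this (the file name urls.txt and the URLs are only placeholders):

```shell
# Build a hypothetical URL list, one URL per line (names are placeholders)
printf '%s\n' \
  'https://example.com/file1.tar.gz' \
  'https://example.com/file2.tar.gz' \
  'https://example.com/file3.tar.gz' > urls.txt

# wget would then fetch every entry with: wget -i urls.txt
cat urls.txt
```

Each line becomes one download; blank lines and leading whitespace should be avoided so wget does not misread an entry.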
Download Files Using Curl
Curl is a command-line tool for transferring data to or from a server. It supports more than 20 protocols, including HTTP, HTTPS, FTP, TFTP, IMAP, and LDAP.
Curl comes preinstalled on almost all Linux distributions. However, if it is not on your system or was accidentally removed, you can install it as follows:
$ sudo apt install curl
The basic syntax of the curl command is as follows:
$ curl [option]… [URL]…
Basic curl command usage
The most basic use of the curl command is to fetch a single file or the content of a web page. For example, to fetch a web page, type curl followed by the URL of the page:
$ curl <URL>
This command does not save anything to disk; by default, curl writes the fetched content to standard output, so after running it you will see the page’s HTML source printed on the screen.
Save the content of the page to a file
You can save the content of a page to a file instead of displaying it on the screen by using curl’s -O or -o option. The -O option saves the file under the same name it has on the remote server, while the -o option lets you save it under a different name.
Using the -O option
With the -O option, you do not need to specify a file name; curl saves the file under its remote name.
$ curl -O <URL>
For instance, the following command will save the file as “index.html”:
$ curl -O https://www.cisco.com/c/en/us/support/switches/index.html
Using the -o option
With the -o option, you can specify a file name of your choice.
$ curl -o filename <URL>
For instance, the following command will save the file with the name “switches.html”:
$ curl -o switches.html https://www.cisco.com/c/en/us/support/switches/index.html
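To see the effect of -o without touching the network, you can use curl’s file:// protocol support against a local file (the paths below are purely illustrative):

```shell
# Create a small local file to act as the "remote" resource (hypothetical path)
printf 'demo content\n' > /tmp/source.txt

# -o saves the fetched content under a name we choose;
# file:// keeps the demo entirely local, with no network dependency
curl -s -o /tmp/renamed.txt file:///tmp/source.txt

cat /tmp/renamed.txt
```

The same pattern works with http:// or https:// URLs once you point it at a real server.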
Run curl silently
If you do not want to see the progress meter or any error messages during a curl download, you can silence it with the -s option as follows:
$ curl -s <URL>
Download multiple files
To download multiple files, type curl with a separate -O option before each URL:
$ curl -O <URL1> -O <URL2> -O <URL3>
This command will save all the files to your current working directory.
If you have many URLs to download, create a text file and list them in it, one per line. Then use xargs to run curl once for each URL in the file:
$ xargs -n 1 curl -O < filename
You can then use the ls command to confirm if all the files have been downloaded.
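To preview what that xargs pipeline will run without actually downloading anything, you can substitute echo for curl; each line of the list file becomes one command (urls.txt and its contents are placeholders):

```shell
# Placeholder URL list, one URL per line
printf '%s\n' \
  'https://example.com/a.tar.gz' \
  'https://example.com/b.tar.gz' > urls.txt

# Dry run: echo prints the exact command xargs would execute for each URL
xargs -n 1 echo curl -O < urls.txt
```

The -n 1 flag tells xargs to pass one argument per invocation, which is why curl runs once per URL rather than once with all of them.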
Resume a partially downloaded file
If a download is interrupted for any reason, or you stopped it manually by pressing Ctrl+C, you can resume it with curl’s “-C -” option. This option continues a partially downloaded file from where it left off.
$ curl -C - <URL>
Both wget and curl are free, open-source command-line utilities for non-interactive file downloads. Remember, although both utilities can download files from the web, they differ considerably in functionality. Visit the wget and curl man pages for a detailed overview of what these utilities are capable of.
Karim Buzdar holds a degree in telecommunication engineering and several sysadmin certifications, including CCNA RS, SCP, and ACE. As an IT engineer and technical author, he writes for various websites.