CURL Command Tutorial in Linux with Example Usage


Transferring data from one place to another is one of the main tasks performed by computers connected to a network. There are plenty of GUI tools for sending and receiving data, but when you are working on a console equipped only with a command line, curl is indispensable. A lesser known fact is that curl works with a wide range of protocols and can handle most of your scripting tasks with ease. Before getting into the details of curl use cases and examples, let's see who is behind its development.

 

Haxx is a team of developer consultants offering solutions to programming problems. They work in the fields of embedded programming, Unix/Linux, networking, device drivers, Perl scripts, etc. One of the co-founders of Haxx gifted the open-source community a tool called curl. The man behind its development is none other than Daniel Stenberg (who is currently a Senior Network Engineer at Mozilla).

 

Reference: Author of CURL and Libcurl

Reference: Haxx Community

 

CURL comes installed by default in most distributions. If you do not have the curl tool installed, it is only a single apt-get (apt-get install curl) or yum (yum install curl) command away. Installing curl will install the curl command line tool as well as the libcurl library. On an Ubuntu system the package names are as follows.

 

  1. curl_7.22.0-3ubuntu4.7_amd64.deb
  2. libcurl3_7.22.0-3ubuntu4.7_amd64.deb
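
 

Once installed, you can quickly confirm the curl version and the protocols your build supports with the standard --version flag.

root@ubuntu1:~# curl --version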

 

What is curl and what makes it a superb command line tool?

CURL is simply awesome because of the following reasons...

  1. CURL is an easy to use command line tool to send and receive files, and it supports almost all major protocols in use (DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMTP, SMTPS, TELNET and TFTP)
  2. Can be used inside your shell scripts with ease
  3. Supports features like pause and resume of downloads
  4. Has around 120 command line options for various tasks
  5. Runs on all major operating systems (more than 40 of them)
  6. Supports cookies, forms and SSL
  7. Both the curl command line tool and the libcurl library are open source, so they can be used in any of your programs
  8. Supports configuration files (see the sketch below this list)
  9. Multiple uploads with a single command
  10. Progress bar, rate limiting, and download time details
  11. IPv6 support

And much more. For more feature sets and other details, refer to http://curl.haxx.se/docs/features.html.
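
 

For instance, to go with point 8 in the list above, frequently used options can be kept in a configuration file and passed to curl with the -K (or --config) option; curl also reads ~/.curlrc automatically if it exists. The file name and option values below are only an illustrative sketch.

root@ubuntu1:~# cat myoptions.txt
# long option names, one per line, without the leading dashes
silent
user-agent = "my-script/1.0"
connect-timeout = 10
root@ubuntu1:~# curl -K myoptions.txt http://example.com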

 

Basic CURL command Usage

Let's get started by looking at some very basic examples of using curl in Linux. By default curl prints the entire output to your console. A nice feature of curl is that it can guess the protocol from the host name in the URL. For example, if you give it a URL like ftp.example.com, curl will use the FTP protocol to fetch the data. If curl cannot guess the protocol, it defaults to HTTP.

 

root@ubuntu1:~# curl example.com

 

 

The above command will print the entire HTTP content of the example.com URL to the console. Here too curl tried to guess the protocol, and since the host name gave no hint, it defaulted to HTTP.

 

Under normal circumstances you should specify a full URI that includes the protocol, in the form protocol://hostname, so that curl uses your desired protocol to fetch the data.
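
For instance, the commands below (reusing the host names from the earlier examples) show how the prefix selects the protocol explicitly.

root@ubuntu1:~# curl http://example.com
root@ubuntu1:~# curl ftp://ftp.example.com/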

The previously shown command does not save the HTML output (it only prints it to the console). If you want to save the output to a file, you can either use shell redirection or the -o option in curl.

 

root@ubuntu1:~# curl example.com > example.html
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  1270  100  1270    0     0   2645      0 --:--:-- --:--:-- --:--:--  5852

 

 

The above command will save the output to the file example.html. Alternatively, you can use the -o option in curl as shown below.

 

root@ubuntu1:~# curl -o example.html example.com
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  1270  100  1270    0     0   2617      0 --:--:-- --:--:-- --:--:--  5799

 

 

Both of these methods show download details such as transfer rate, time, and bytes transferred.

 

Downloading Multiple Files using a single CURL command

 

The CURL command can be used to download multiple files at the same time, using multiple -O options. An important fact to note here is that curl will use the same TCP connection to download all the files. This is done to improve download speed: establishing a TCP connection to a target server takes several steps, known as the three-way handshake. To avoid repeating that handshake, curl reuses the same connection for all downloads from the same server issued by a single command.

 

Related: What is TCP three-way Handshake

 

root@ubuntu1:~# curl -O http://ftp.gnu.org/pub/gnu/libiconv/libiconv-1.10.tar.gz -O http://ftp.gnu.org/pub/gnu/libiconv/libiconv-1.12.tar.gz -O http://ftp.gnu.org/pub/gnu/libiconv/libiconv-1.13.tar.gz

 

The above command will download three different versions of the libiconv package from ftp.gnu.org (please note that all three URLs are different). Another advantage of the -O option is that it saves the output to a file with exactly the same name as the file in the URL. In our example, it will save the three files as libiconv-1.10.tar.gz, libiconv-1.12.tar.gz and libiconv-1.13.tar.gz.

 

Following HTTP redirection Using CURL

Both 302 and 301 are widely used status codes for HTTP redirection: 302 normally indicates a temporary redirect, while 301 means the resource has moved permanently. It is quite normal to encounter such URLs while using curl. In either case, HTTP is designed to reply with a new URL from which the client can fetch the data.

 

Google does this kind of redirection if you curl google.com (instead of www.google.com). The HTTP response will contain two things: the status code (301 or 302) and an alternative URL which the client can use. Web browsers will automatically redirect you to the alternative URL when they encounter a 301 or 302, but curl will show you the exact message returned by the server. Let me show you this.

 

root@ubuntu1:~# curl google.com
<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>301 Moved</TITLE></HEAD><BODY>
<H1>301 Moved</H1>
The document has moved
<A HREF="http://www.google.com/">here</A>.
</BODY></HTML>

 

 

If you want CURL to follow the redirect to the new URL automatically, you need to use the -L option as shown below.

root@ubuntu1:~# curl -L google.com

 

 

Pause/Resume Downloads using Curl Command

 

Similar to any GUI download manager, you can pause and resume downloads using CURL. This can be achieved with the -C option as shown below. Let's first start a download and then cancel it (with Ctrl+C).

 

root@ubuntu1:~# curl -O http://ftp.gnu.org/pub/gnu/libiconv/libiconv-1.14.tar.gz
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0 4867k    0  1196    0     0   1631      0  0:50:56 --:--:--  0:50:56  5363

 

 

Now let's resume the download...

 

root@ubuntu1:~# curl -C - -O http://ftp.gnu.org/pub/gnu/libiconv/libiconv-1.14.tar.gz
** Resuming transfer from byte position 28672
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0 4839k    0 30339    0     0  21558      0  0:03:49  0:00:01  0:03:48 27185

 

On resuming the transfer, curl clearly reports that it is resuming from byte position 28672. This is because curl checked the partially saved file libiconv-1.14.tar.gz in the current directory, found the bytes that had already been downloaded, and continued from where it left off.

 

See Complete Request And Response Headers with CURL

 

Commands in Linux usually come with a verbose option. In verbose mode, a command shows the complete details of everything it encounters. Using verbose output in curl lets you see all headers, both request and response. Verbose mode is enabled by simply passing -v.

 

root@ubuntu1:~# curl -v example.com

 

This will show you the complete headers that curl encounters while fulfilling the request, including the request headers it sent and the response headers it received.

 

Related: HTTP Request and Response Tutorial

 

Show only Response Headers in CURL

 

Sometimes you only want to see the response headers returned by the server, without the actual response body. This is useful in scripts to verify the response status, content length, server type, etc. It is done with the -I option, as shown below.

 

root@ubuntu1:~# curl -I example.com
HTTP/1.1 200 OK
Accept-Ranges: bytes
Cache-Control: max-age=604800
Content-Type: text/html
Date: Mon, 17 Mar 2014 12:42:14 GMT
Etag: "359670651"
Expires: Mon, 24 Mar 2014 12:42:14 GMT
Last-Modified: Fri, 09 Aug 2013 23:54:35 GMT
Server: ECS (iad/19AB)
X-Cache: HIT
x-ec-custom-error: 1
Content-Length: 1270

 

Use Proxy Server with CURL

 

If you want to send your request through a proxy server, you can do that with the -x option as shown below. Sometimes these proxy servers require authentication first, so let's see how to perform a curl request with proxy authentication.

 

root@ubuntu1:~# curl -x http://proxyserver:proxyport --proxy-user user:password -L http://example.com

 

In the above command, proxyserver is your proxy server's host name or IP address, proxyport is your proxy server's port, and user:password is of course your proxy credentials. I have used the -L option to follow any redirection that curl encounters (as seen earlier in Following HTTP redirection Using CURL).

 

Ignore SSL Certificate Error with CURL

 

SSL certificates need to be signed by an authorized certificate authority. Otherwise, user agents such as browsers will warn you and require a user action, like pressing a continue button. Normally curl will refuse the connection altogether if it finds an unknown SSL certificate. Although this is a nice security feature, we often configure self-signed certificates for internal servers. In such cases you need to accept the unknown certificate warning and continue with the request. This can be done in curl using the -k option, as shown below.

 

root@ubuntu1:~# curl -k https://10.1.136.101:4848

 

Related: How SSL Works?

 

Modify User Agent In HTTP request

When a web browser sends an HTTP request, it includes details about itself in the request. These details allow the server to identify the browser. The HTTP request header that carries this information is called User-Agent. Servers can be configured to respond with a different page layout for different user agents, and the user agent information sent by the client is visible in server logs.

 

A normal HTTP request sent by a Firefox web browser will have a user agent that looks something like "Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.2.3)".

Using curl you can set the user agent value to any string you require, as shown below.

 

root@ubuntu1:~# curl -A "YOUR USER AGENT STRING GOES HERE" http://example.com
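
For instance, to present the Firefox user agent string mentioned above, the request would look like this.

root@ubuntu1:~# curl -A "Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.2.3)" http://example.com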

 

Download Rate Limit with Curl

You can ask curl to limit the download speed to a desired value. This is very helpful when you do not want curl to consume all of the available bandwidth. It is done as shown below.

 

root@ubuntu1:~# curl --limit-rate 100k -O http://ftp.gnu.org/pub/gnu/libiconv/libiconv-1.10.tar.gz

 

In the above command, the suffix k stands for kilobytes. You can also use m for megabytes or g for gigabytes; without a suffix, the value is interpreted as bytes per second.
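
For example, to cap the same download at one megabyte per second instead:

root@ubuntu1:~# curl --limit-rate 1m -O http://ftp.gnu.org/pub/gnu/libiconv/libiconv-1.10.tar.gz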

 

Using CURL for FTP download and Upload

As discussed at the beginning of this article, CURL can work with many different protocols. A common use is to upload or download files from your FTP server without any dedicated FTP client software. Credentials are passed with the --user argument. Let's see how to do this.

Download File from FTP server using CURL

root@ubuntu1:~# curl ftp://example.com/mydirectory/myfile.zip --user username:password -o myfile.zip

 

The above command will download the file myfile.zip from ftp://example.com/mydirectory/ and save it as myfile.zip. If you first want to see the directory structure of your FTP server, you can do that with the following command.

 

root@ubuntu1:~# curl ftp://example.com --user username:password

The above command will list the directories and files inside your user's document root.

Uploading files to FTP server using CURL

Uploading is done with the -T option in curl and is very simple and straightforward, as shown below.

 

root@ubuntu1:~# curl -T myfile.zip ftp://example.com/mydirectory/ --user username:password

 

Deleting files on FTP server

You can also delete files on your FTP server using the curl command, as shown below.

root@ubuntu1:~# curl ftp://example.com/ -X 'DELE myfile.zip' --user username:password

 

Sending Emails using CURL

If you read the list of protocols supported by curl, you will find SMTP as well, so the curl command can be used to send emails too. An example is shown below.

 

root@ubuntu1:~# curl --url "smtps://smtp.example.com:465" --ssl-reqd   --mail-from "user@example.com" --mail-rcpt "friend@example.com"   --upload-file mailcontent.txt --user "user@example.com:password" --insecure

 

In the above command, replace smtps://smtp.example.com:465 with your SMTP server and port.

--mail-from: This field contains the from address that the receiver should see.

--mail-rcpt: This field contains the To (recipient) address

--upload-file: The file provided here should contain your message content

--user: SMTP user@domain:password

The --insecure option is exactly the same as the -k option we saw earlier: it ignores unknown SSL certificates.
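
Since curl uploads the file as-is as the message data, mailcontent.txt can begin with standard mail headers (From, To, Subject) followed by a blank line and the body. A minimal sketch, with placeholder addresses matching the command above:

From: "User" <user@example.com>
To: "Friend" <friend@example.com>
Subject: Test mail sent with curl

This is the body of the message.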

 

 

Using CURL to send HTTP POST, PUT, DELETE requests

Apart from the examples seen above, curl can be used to send different types of HTTP request methods to your server. This becomes very handy if you want to configure an application through a REST API. I will show you some examples of sending REST API requests using CURL.

 

Sending POST request using CURL

If you use CURL to send a request to a server without any request method option in the arguments, it will use an HTTP GET request by default, which is sensible behavior. Using the -X option in curl, however, you can send any HTTP method you require to the server. Let's see a POST request.

 

root@ubuntu1:~# curl -X POST -u admin:admin http://example.com/myconfigs/status -H "Content-Type: application/xml" -d @/home/user/file.xml

 

In the above example, POST is the request method, -u is used to supply the credentials needed to access that specific resource on the server, -H "Content-Type" specifies the format of the content we are sending to the server (it can be XML, plain text, etc.), and -d with @/home/user/file.xml tells curl to send the contents of file.xml as the request body. That file would contain the configuration options, in the correct syntax, that the URL http://example.com/myconfigs/status accepts.
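
If the payload is small, the same request can also be made with the data given inline instead of read from a file; the XML snippet here is just a placeholder, not a real configuration.

root@ubuntu1:~# curl -X POST -u admin:admin http://example.com/myconfigs/status -H "Content-Type: application/xml" -d '<status>enabled</status>'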

 

Sending a PUT request is exactly the same as the POST example shown above; simply replace POST with PUT. If your web server does not accept these methods, you will get a 405 error in reply. HTTP 405 means that the server does not allow that HTTP method on that specific URL.
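
As a sketch against the same example URL (whether these succeed depends entirely on what the server allows), a PUT and a DELETE would look like this.

root@ubuntu1:~# curl -X PUT -u admin:admin http://example.com/myconfigs/status -H "Content-Type: application/xml" -d @/home/user/file.xml
root@ubuntu1:~# curl -X DELETE -u admin:admin http://example.com/myconfigs/status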

 

You can also get an HTTP Unsupported Media Type error in reply if the server does not accept the application/xml format. The HTTP status code for Unsupported Media Type is 415.

 

Sending Cookies to Server using CURL

 

CURL can be used to send requests to the server along with previously received cookies. You can either pass cookies in NAME=VALUE format or give a file name as the parameter, so that curl reads the cookie data from that file. You can also save the cookies a server sends you to a file, using the -c option.

 

root@ubuntu1:~# curl -b mycookies.txt http://example.com

 

 

Using the above command, you send the cookies saved in a file (mycookies.txt in our case) to the server. Alternatively, you can pass cookies directly on the command line, as shown below.

 

root@ubuntu1:~# curl -b "name=value" http://example.com

 

 

If there is no = sign in the parameter passed to the -b switch, curl treats it as a file name from which cookies should be read (basically the first method we saw).
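
And to capture the cookies a server sets, so that they can be replayed later with -b, pass a file name to the -c option (mycookies.txt here is just the file name used in the earlier example).

root@ubuntu1:~# curl -c mycookies.txt http://example.com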

 

Sending Your own HTTP headers using CURL

HTTP headers carry details that help the server and the client understand each other better. Headers contain a lot of information; some examples are shown below. Note that we have already modified one header with curl, when we changed the User-Agent.

 

Accept-Ranges: bytes
Cache-Control: max-age=604800
Content-Type: text/html
Date: Tue, 18 Mar 2014 15:42:32 GMT
Etag: "359670651"
Expires: Tue, 25 Mar 2014 15:42:32 GMT
Last-Modified: Fri, 09 Aug 2013 23:54:35 GMT
Server: ECS (iad/19AB)
X-Cache: HIT
x-ec-custom-error: 1
Content-Length: 1270

 

All of the fields shown above, returned by a server, are HTTP headers. You can also modify the HTTP request headers using CURL. Whether this is a good feature or a bad one deserves a serious debate, because modifying headers can fool the web server receiving the request; this is why the data found in headers can never be fully trusted. Let's see how to modify headers while sending a request.

 

The -H option in curl is used to set or modify headers. You can modify any header you wish. For example, let's set the Accept and Content-Type headers in the request.

 

root@ubuntu1:~# curl -H "Accept: application/xml" -H "Content-Type: application/xml" http://example.com

Verifying SSL certificates using CURL

 

We saw the method to ignore SSL certificate verification with the -k option. But what if you want to verify the certificate the server presents? In that case, you need to provide the CA (Certificate Authority) certificate to curl. This is done with a simple command line option called --cacert.

 

root@ubuntu1:~# curl --cacert my-ca.crt https://example.com

 

Download a file depending upon file modification time

 

At times, you only want to download a document from a URL if it has been modified after a specified time. This is a very handy option in curl, and it is done with the -z option as shown below.

 

root@ubuntu1:~# curl -z 3-Jan-14 http://example.com/myfile.gz

 

The above command will download the file myfile.gz only if it was modified after 3 January 2014. Hope these examples were helpful.


Comments

Great write-up. Explains the various virtues of curl in a very concise manner.

Thanks for sharing this solid write up. I was not aware of curl's diverse usage before reading your guide.

Nice information to know about curl command and its usability.

Sir Pillai and Sir Tiwary,

Just a small suggestion here. This website is amazing. The work that you guys are doing should actually have been done by those employees of Tier 1 companies earning absurd amount of packages instead of not knowing half of what you guys share.

Still this website can go a notch higher in term of help that you are providing, if, you guys can pile up interview related questions too, for freshers. This will not only help them to understand the format of questions asked but also if it comes from 2 of the greatest networking professionals whom I know in India (that's you both), it will prepare them for any kind of questions an interviewer might come up with.

:) :)


Hi Abbhi,

Many thanks for your kind words buddy...That means a lot to me..
The only thing that's stopping me currently from doing that is lack of time. A lot of my readers have come up with a very similar suggestion. I will surely keep a note of this in my to-do list.

Thanks again buddy. Chal see you.

-Sarath

Hello, your presentation gave me high hopes but...
I'm not able to find the syntax for downloading multiple files from a windows share.
I tried: curl 'domain%5Cuser_name'@10.21.21.26/Some/Path/ and... is a nope. How can I use curl for download specific files (created in last 24 hours) from a windows share?

Thank you very much for providing such good information.

is there any way to download when cURL return 403 ?


Hi,

Indeed information is very useful.

I am trying to post a input using xml, however I am getting error as unknown host.

My requirement is I have to connect to proxy and from there to actual end system. Below command I am using and getting error.

curl -X POST -u  <username>> http://proxy server url:port -L https:<<endsystem url>> -Hcontent-type:application/xml -d @path of the file.request.xml

Error: Enter host password for user ' ':
curl: (7) couldn't connect to host

Could you please help me in testing with a payload from my box ---> to proxy ----> to actual system.

Greetings,
Madhu Ayli

Hi Both,

It is awesome post, helps me a lot to know more about curl and it's different options.

However, I was looking for how to use config file, to avoid the command line options and for security purpose.

Would you put some light on this.

Thanks in Advance.

how to use -# with curl

Hi Sarath/Satish,

Could you tell me the use of -H option in curl and how can i get the cookie and header information from the http response.

Regards
Arun

gives just enough definition of the critical options!

how to generate externally from the arXiv.org website, using the curl command line tool and a few perl scripts?

Very good article in a concise format.
I have bookmarked this for reference
Thanks a bunch!!!
