The curl command is a command-line tool for transferring data to and from servers over many protocols (HTTP, HTTPS, FTP, and more). It is essential for testing APIs, downloading files, and everyday web development tasks.
Key Concepts
- URL: Uniform Resource Locator - the web address
- HTTP Methods: GET, POST, PUT, DELETE for different operations
- Headers: Metadata sent with requests (authentication, content type)
- Response Codes: Status indicators (200 OK, 404 Not Found)
- SSL/TLS: Secure communication protocols
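A quick way to see methods, headers, and response codes together is a HEAD request against any site; example.com below is only a placeholder host:
curl -I https://example.com
The first line of the output carries the status code (for example, HTTP/2 200), and the lines that follow are the response headers, such as content-type and date.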
Command Syntax
curl [options] URL
- Transfers data to/from servers
- Options control request method, headers, output
- Supports multiple protocols and authentication
Common Options
-X METHOD - Specify HTTP method (GET, POST, etc.)
-H "header" - Add custom header
-d "data" - Send data in request body
-o filename - Save output to file
-O - Save with remote filename
-I - Show headers only
-L - Follow redirects
-v - Verbose output for debugging
-s - Silent mode (no progress)
-k - Ignore SSL certificate errors
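These options combine freely in a single command; the sketch below assumes a hypothetical JSON API at api.example.com:
curl -s -L -H "Accept: application/json" -o users.json https://api.example.com/users
Here -s hides the progress meter, -L follows any redirect, -H sets a request header, and -o writes the response body to users.json instead of the terminal.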
Practical Examples
Example 1: Basic GET request
curl https://api.github.com/users/octocat
Fetches user data from GitHub API and displays JSON response
Example 2: Download file
curl -O https://example.com/file.zip
Downloads file.zip to current directory with same name
Example 3: POST data
curl -X POST -H "Content-Type: application/json" -d '{"name": "Jane Doe", "email": "jane@example.com"}' https://api.example.com/users
Sends JSON data to create new user
Example 4: Authentication
curl -u username:password https://api.example.com/protected
Uses basic authentication to access protected resource
Example 5: Save response to file
curl -o response.json https://api.example.com/data
Saves API response to response.json file
Use Cases
- Testing REST APIs during development
- Downloading files and web content
- Automating web requests in scripts
- Checking website availability and response times
- Uploading files to servers
- Web scraping and data collection
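As a sketch of the availability-check use case above, curl's -w variables expose the status code (example.com stands in for the site being monitored):
if [ "$(curl -s -o /dev/null -w '%{http_code}' https://example.com)" != "200" ]; then
  echo "site is down or returned an unexpected status"
fi
Adding %{time_total} to the -w format string also reports how long the request took, which covers the response-time half of that use case.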
Related Commands
wget - Alternative download tool, simpler syntax
httpie - User-friendly HTTP client
lynx - Text-based web browser
nc - Network connection utility
Tips & Troubleshooting
- Use the -v flag to debug connection issues
- Add -k for self-signed SSL certificates
- Use -L to handle redirects automatically
- Check response codes with -w "%{http_code}"
- For large downloads, use -C - to resume
- Set a timeout with --connect-timeout 30
- Use --retry 3 for unreliable connections
- Escape special characters in URLs by wrapping the URL in quotes
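Several of these flags work well together for unattended downloads; a sketch with a placeholder URL:
curl -L -C - --connect-timeout 30 --retry 3 -O "https://example.com/large-file.iso"
Here -C - resumes a partial download where it left off, --connect-timeout 30 gives up on unreachable hosts after 30 seconds, and --retry 3 retries transient failures up to three times.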