Pipes

Using pipes to connect commands together and create data processing pipelines in Linux

Pipes in Linux allow you to connect commands together, sending the output of one command as input to another. This powerful feature enables complex data processing by chaining simple commands together.

Key Concepts

  • Pipe Symbol (|): Connects stdout of one command to stdin of another
  • Pipeline: Chain of commands connected by pipes
  • Stream Redirection: Data flows from left to right through the pipeline
  • Filter Commands: Commands designed to process piped input
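The stdout-to-stdin handoff can be seen with a minimal two-command pipeline:

```shell
# printf writes "hello" to stdout; the pipe feeds it to tr's stdin,
# which upcases the stream
printf 'hello\n' | tr 'a-z' 'A-Z'
# HELLO
```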

Command Syntax

command1 | command2 | command3

  • Output of command1 becomes input for command2
  • Output of command2 becomes input for command3
  • Data flows sequentially through the pipeline
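For instance, a three-stage pipeline in which each command transforms the stream in turn:

```shell
# Stage 1 emits three lines, stage 2 keeps those starting with "a",
# stage 3 counts what is left
printf 'apple\nbanana\navocado\n' | grep '^a' | wc -l
# 2
```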

Common Pipe Combinations

  • grep pattern - Filter lines containing pattern
  • sort - Sort the output alphabetically
  • wc -l - Count lines in output
  • head -n N - Show first N lines
  • tail -n N - Show last N lines
  • uniq - Remove adjacent duplicate lines (usually preceded by sort)
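These filters are often combined; note that uniq only collapses adjacent duplicates, which is why it usually follows sort:

```shell
# sort groups identical lines together so uniq can drop the repeats
printf 'cat\ndog\ncat\nbird\n' | sort | uniq
# bird
# cat
# dog
```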

Practical Examples

Example 1: List and count files

ls -la | wc -l

Counts the lines of ls output (note this includes the "total" summary line plus the . and .. entries, so it slightly overstates the file count)

Example 2: Find running processes

ps aux | grep firefox

Shows all Firefox processes currently running
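One caveat: the grep process itself often shows up in this listing, because its own command line contains the word firefox. A common workaround is the bracket trick:

```shell
# The pattern [f]irefox still matches "firefox", but the grep process's
# command line now reads "[f]irefox", which the pattern does not match
ps aux | grep '[f]irefox'
```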

Example 3: Sort disk usage

du -h /var/log/* | sort -hr

Lists log files by size, largest first (sort -h understands the human-readable suffixes that du -h produces)

Example 4: Multiple pipe chain

cat /etc/passwd | grep -v nologin | cut -d: -f1 | sort

Shows users with login shells, sorted alphabetically
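The same chain can be tried on sample passwd-style lines (hypothetical users) to watch each stage at work; note that grep can also read /etc/passwd directly, making the initial cat optional:

```shell
# Hypothetical passwd-format input: drop nologin accounts,
# keep field 1 (the username), then sort
printf 'zoe:x:1001:1001::/home/zoe:/bin/bash\nsys:x:2:2::/dev:/usr/sbin/nologin\nanna:x:1002:1002::/home/anna:/bin/bash\n' \
    | grep -v nologin | cut -d: -f1 | sort
# anna
# zoe
```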

Example 5: Network connections

netstat -an | grep :80 | wc -l

Counts active connections on port 80
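netstat belongs to the deprecated net-tools package and is missing on many current distributions; ss from iproute2 accepts a similar pipeline:

```shell
# -t TCP sockets, -a all states, -n numeric addresses/ports
ss -tan | grep ':80' | wc -l
```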

Use Cases

  • Log Analysis: Filter and analyze log files
  • Data Processing: Transform and format text data
  • System Monitoring: Combine commands for detailed info
  • File Management: Search, sort, and organize files
  • Report Generation: Create formatted output

Related Commands & Redirection

  • tee - Send output to a file AND the next command
  • xargs - Convert input into command arguments
  • > - Redirect output to a file (overwrite)
  • >> - Append output to a file
  • < - Redirect a file as input

Tips & Troubleshooting

Performance Tips

  • Use specific filters early in pipeline
  • Avoid unnecessary pipes with built-in options
  • Consider grep -v instead of complex exclusions
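Filtering early keeps expensive stages small; with a hypothetical access.log, placing grep before sort means sort only buffers matching lines:

```shell
# Expensive: sort buffers every line before grep discards most of them
#   sort access.log | grep 'ERROR'
# Cheaper: grep shrinks the stream before the sort
grep 'ERROR' access.log | sort
```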

Common Issues

  • Broken Pipe: Occurs when a downstream command exits while an upstream command is still writing
  • Permission Errors: A failing stage can leave the rest of the pipeline with no input
  • Memory Usage: Stages that buffer their whole input (such as sort) can consume significant RAM

Best Practices

  • Test each command separately first
  • Use quotes around patterns with spaces
  • Consider intermediate files for complex operations
  • Remember pipes carry byte streams; most filter commands expect line-oriented text

Debugging Pipelines

# Add tee to see intermediate output
command1 | tee debug.txt | command2

Saves intermediate results while continuing pipeline
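In bash, the PIPESTATUS array records the exit status of every stage, not just the last, which helps pinpoint where a pipeline broke:

```shell
# false exits 1, true exits 0; $? alone would only show the last stage's 0
false | true
echo "${PIPESTATUS[@]}"
# 1 0
```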