Linux Pipes Explained: Practical Examples for Everyday Use

In Linux, the vertical bar (|), commonly called a pipe, is one of the most powerful tools available on the command line. Pipes allow the output of one command to be used directly as the input of another, making it possible to build complex workflows from simple, single-purpose tools.

This article introduces the core concept of Linux pipes and walks through practical examples commonly used in development and system operations.

🧠 Basic Concept of Linux Pipes

A pipe connects the standard output (stdout) of one command to the standard input (stdin) of the next. Instead of writing intermediate results to a file, data flows directly from one command to another.

A simple example:

cat file.txt | grep "keyword"

Here:

  • cat file.txt outputs the file contents
  • grep "keyword" receives that output and filters matching lines

This approach keeps commands concise, readable, and efficient.
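
In this simple case the pipe is mainly illustrative: grep can read the file directly, so an equivalent without cat is:

grep "keyword" file.txt

The pipe becomes essential when the data comes from another command rather than from a file.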

πŸ“Š Counting Specific Log Entries

Log analysis is a classic use case for pipes. Suppose you want to count how many times "ERROR" appears in a log file:

cat error.log | grep "ERROR" | wc -l

Command breakdown:

  • cat error.log β€” outputs the log file
  • grep "ERROR" β€” filters lines containing ERROR
  • wc -l β€” counts matching lines

This pipeline turns a multi-step task into a single command.
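
If you only need the count, grep can filter and count in one step; a shorter equivalent of the pipeline above is:

grep -c "ERROR" error.log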

πŸ“ Finding the Top 3 File Extensions

To identify the most common file types in a directory:

ls -l | awk '{print $NF}' | rev | cut -d. -f1 | rev | sort | uniq -c | sort -nr | head -n 3

What each stage does:

  • ls -l β€” lists files
  • awk '{print $NF}' β€” extracts filenames
  • rev | cut -d. -f1 | rev β€” extracts file extensions
  • sort | uniq -c β€” counts occurrences
  • sort -nr β€” sorts by count (descending)
  • head -n 3 β€” shows the top three

This example highlights how pipes enable powerful data processing with standard tools.
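
Note that ls -l also emits a "total" header line and passes through names with no extension, which can skew the counts. A stricter variant, sketched here assuming GNU find, limits the input to regular files whose names actually contain a dot:

find . -maxdepth 1 -type f -name '*.*' | rev | cut -d. -f1 | rev | sort | uniq -c | sort -nr | head -n 3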

πŸ”Œ Checking Port Usage and Killing a Process

To find and terminate the process using port 80:

sudo netstat -tulnp | grep ':80' | awk '{print $7}' | cut -d/ -f1 | xargs sudo kill -9

Pipeline steps:

  • netstat -tulnp β€” lists listening ports together with the owning PID/program name
  • grep ':80' β€” filters port 80
  • awk '{print $7}' β€” extracts PID information
  • cut -d/ -f1 β€” isolates the numeric PID
  • xargs sudo kill -9 β€” terminates the process

⚠️ Note: Use kill -9 with caution, as it forcefully stops processes without cleanup.
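
Also be aware that grep ':80' matches any port beginning with 80 (such as 8080); a stricter pattern like ':80 ' avoids that. On systems where netstat is unavailable (many distributions now favor ss), lsof offers a more compact route to the same result; a sketch assuming lsof is installed:

sudo lsof -ti:80 | xargs -r sudo kill -9

Here -t prints only PIDs, -i:80 selects processes using port 80, and -r (a GNU xargs flag) skips the kill entirely when nothing is listening.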

πŸ“ Viewing and Saving Output Simultaneously

Sometimes you want to see command output while also saving it to a file. The tee command enables this:

ps aux | tee output.txt | wc -l

Explanation:

  • ps aux β€” lists all running processes
  • tee output.txt β€” writes the output to output.txt and forwards it to the next command
  • wc -l β€” counts the output lines (the number of processes, plus one for the ps header)

Think of tee as a T-junction in the pipeline.
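
By default tee overwrites output.txt on each run; adding the -a flag appends instead, which is handy when collecting snapshots over time:

ps aux | tee -a output.txt | wc -l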

βœ… Summary

Linux pipes transform simple commands into flexible, expressive workflows. By chaining tools together:

  • Temporary files become unnecessary
  • Commands stay modular and readable
  • Complex tasks become fast and repeatable

The real power of pipes lies in composability. Each command focuses on one job, and the pipe handles data flow. Mastering this concept turns the Linux command line into a highly efficient problem-solving toolβ€”whether you’re debugging logs, managing processes, or analyzing data.
