AI Computer Institute
Expert-curated CS & AI curriculum aligned to CBSE standards. A bharath.ai initiative.

Command Line Mastery

📚 Projects & Applied · ⏱️ 19 min read · 🎓 Grade 8
✍️ AI Computer Institute Editorial Team · Published: March 2026 · CBSE-aligned · Peer-reviewed
Content curated by subject matter experts with IIT/NIT backgrounds. All chapters are fact-checked against official CBSE/NCERT syllabi.

Command Line Mastery: Terminal Like a Professional

The command line is the superpower of every programmer and system administrator. In this chapter, you'll master terminal navigation, file management, powerful command piping, and bash scripting. These skills let you automate tasks, manage servers, and work far faster than with GUI tools alone.

Why Terminal Skills Matter in Today's Tech World

Every major tech company (Google, Amazon, ISRO, TCS, Infosys) employs teams that work entirely in the terminal. From deploying code to managing servers, cloud infrastructure, and batch processing gigabytes of data—the command line is indispensable. Cloud platforms like AWS, Azure, and Google Cloud are managed primarily through CLI tools. If you want a career in backend development, DevOps, or system administration, terminal mastery is non-negotiable.

Terminal Navigation: Finding Your Way Around

Every directory (folder) has a path. Your terminal always runs in some directory, and knowing where you are is the first step.

pwd                         # Print Working Directory - shows current location
                             # Output: /home/student/projects

ls                          # List files in current directory
ls -la                      # Detailed listing with hidden files and permissions
                             # Output: drwxr-xr-x  5 student staff 4096 Jan 20 10:30 .
                             #         -rw-r--r--  1 student staff 2048 Jan 20 10:15 notes.txt

ls -lh                      # Human-readable file sizes
                             # Output: -rw-r--r--  1 student staff 2.0K Jan 20 10:15 file.txt

cd /home/student/documents  # Change to specific directory
cd ..                       # Go up one directory level
cd .                        # Stay in current directory
cd ~                        # Go to home directory

# Navigating around an e-commerce project
cd ~/projects/ecommerce     # Enter project
cd backend/models           # Navigate to models folder
pwd                         # Verify location: /home/student/projects/ecommerce/backend/models

# Creating directory structures for a project
mkdir -p my_app/src/components    # Create nested directories in one command
                                   # Creates: my_app/ → src/ → components/
      

File Operations: Creating, Copying, Moving, Deleting

Master these operations to manage files like a pro. Remember: rm deletes permanently—there's no undo!

touch file.txt              # Create empty file
touch student_marks.csv     # Create empty CSV

cat file.txt                # Display entire file content
cat -n file.txt             # Display with line numbers

head -10 large_file.txt     # Show first 10 lines
tail -10 large_file.txt     # Show last 10 lines
tail -f logs.txt            # Follow file (live updates for logs)

cp source.txt dest.txt      # Copy file
cp -r source_folder dest_folder  # Recursively copy directory

mv old_name.txt new_name.txt  # Rename or move file
mv file.txt /home/student/documents/  # Move file to another directory

rm file.txt                 # Delete file (permanent!)
rm -i file.txt              # Interactive mode - asks before deleting
rm -r folder_name           # Remove directory and all contents

# Real example: Managing student data files
cd ~/school_data
touch students_2024.csv     # Create roster
cp students_2024.csv students_2024_backup.csv  # Backup
mv students_2024.csv class8_students.csv       # Rename

# Search for files
find . -name "*.txt"        # Find all .txt files in current directory
find . -type f -name "marks*"  # Find all files whose names start with "marks"
      

Searching and Processing Text: grep and sed

grep is one of the most powerful tools in Unix. With it, you can search through millions of lines instantly.

grep "keyword" file.txt     # Find lines containing keyword
grep "error" application.log  # Find all error messages in logs
grep -i "ERROR" file.txt    # Case-insensitive search
grep -c "pattern" file.txt  # Count matching lines
grep -n "pattern" file.txt  # Show line numbers

# Practical example: Finding student records
grep "roll_no: 45" student_database.txt  # Find student with roll number 45
grep "Science" subjects.txt | grep -c "pass"  # Count Science passes

# Using sed to replace text
sed 's/old/new/g' file.txt  # Replace all occurrences
sed 's/Score: [0-9]*/Score: 95/' marks.txt  # Replace scores

# Example: Fix data quality issues
grep "Delhi" addresses.txt  # Find all Delhi addresses
sed 's/Delhi/New Delhi/g' addresses.txt > addresses_fixed.txt  # Standardize
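By default sed prints to stdout and leaves the file untouched. GNU sed's -i flag edits the file in place (on macOS/BSD sed the flag needs an argument: sed -i ''). A quick demo with throwaway data:

```shell
# Create a throwaway file, then standardize it in place with sed -i
printf 'Delhi\nMumbai\nDelhi\n' > addresses_demo.txt
sed -i 's/Delhi/New Delhi/g' addresses_demo.txt   # GNU sed; on macOS use: sed -i ''
cat addresses_demo.txt
```

Always test the substitution without -i first; an in-place edit has no undo.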
      

Pipes and Redirection: Chaining Commands Like a Boss

Unix philosophy: Do one thing, do it well. Pipes combine simple tools into powerful workflows.

# Pipes (|) - send output from one command to another
cat names.txt | sort       # Sort names
cat names.txt | sort | uniq  # Sort and remove duplicates

ls -la | grep "\.py"       # List Python files (escape the dot, which grep treats as "any character")
cat logs.txt | grep "error" | wc -l  # Count error lines

# Redirection operators
ls > files.txt             # Redirect output to file (overwrites)
ls >> files.txt            # Append output to file (doesn't overwrite)
cat input.txt > output.txt # Copy file using redirection

# Input redirection
sort < unsorted.txt        # Read from file, sort, display

# Real example: Processing student records for ISRO school coding program
cat student_list.csv | grep "Science" | grep "A+" | wc -l
# Count high-performing Science students

# Combine sort, uniq, and wc to analyze data
cat sales.txt | sort | uniq | wc -l  # Count unique sales records

# Create a report with redirection
echo "School Performance Report" > report.txt
date >> report.txt
grep "A+" marks.txt >> report.txt
echo "Report generated successfully" >> report.txt
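One more pipe-friendly tool worth knowing: tee writes its input to a file and also passes it along the pipe, so you can save an intermediate result without breaking the chain. A sketch with throwaway data:

```shell
# tee saves grep's results to toppers.txt AND forwards them to wc -l
printf 'Asha,A+\nRavi,B\nMeena,A+\n' > marks_demo.txt
grep "A+" marks_demo.txt | tee toppers.txt | wc -l   # wc -l counts the matching lines
```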
      

Environment Variables: Configuration Without Changing Code

Environment variables store configuration like API keys, database URLs, and settings. They make code flexible without hardcoding values.

echo ${PATH}               # Show all directories where commands are found
                             # Directories are separated by colons

echo ${HOME}               # Home directory (/home/username)
echo ${USER}               # Current username

# Set environment variable (temporary - only for this session)
export SCHOOL_NAME="Delhi Public School"
echo ${SCHOOL_NAME}        # Access it

# Use variables in scripts
export DB_HOST="localhost"
export DB_USER="admin"
export DB_PASSWORD="secret123"   # Placeholder only - never hardcode real passwords

# List all environment variables
printenv                    # Show all variables
env                         # Alternative command

# Real example: Configuration for IRCTC ticket system
export IRCTC_API_KEY="your_api_key_here"
export DATABASE_URL="postgresql://localhost:5432/irctc"
export PAYMENT_GATEWAY="UPI"

# Access in Python: os.environ['IRCTC_API_KEY']
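A related pattern you'll see in real scripts is the fallback default ${VAR:-default}: use the environment value if it is set, otherwise fall back to a sensible default. A minimal sketch (the variable names here are just for illustration):

```shell
#!/bin/bash
unset DB_HOST                       # pretend nothing is configured yet
DB_HOST="${DB_HOST:-localhost}"     # falls back to localhost
echo "Connecting to ${DB_HOST}"

export DB_HOST="db.school.local"    # now a value is configured
DB_HOST="${DB_HOST:-localhost}"     # the configured value wins
echo "Connecting to ${DB_HOST}"
```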
      

Shell Scripting: Automating Repetitive Tasks

Bash scripts automate complex workflows. Instead of typing 50 commands, write one script and run it anytime.

#!/bin/bash
# This is a bash script - first line is "shebang"

echo "Welcome to School Management"
echo "Today is $(date)"

# Variables
SCHOOL_NAME="Delhi Public School"
YEAR=2024
echo "School: ${SCHOOL_NAME}, Year: ${YEAR}"

# Read user input
read -p "Enter student roll number: " roll_number
echo "You entered: ${roll_number}"

# Conditional logic
if [ "$roll_number" -lt 50 ]; then
    echo "Roll number in first half"
else
    echo "Roll number in second half"
fi

# Loop through files
for file in *.txt; do
    echo "Processing: ${file}"
done

# For loop with numbers
for i in {1..10}; do
    echo "Student ${i}"
done

# While loop
counter=1
while [ $counter -le 5 ]; do
    echo "Iteration: $counter"
    counter=$((counter + 1))
done
      

Advanced: Real-World Scripts for School Systems

Here's a production-style script of the kind used to automate nightly backups in school IT systems:

#!/bin/bash
# School Data Backup and Reporting System

# Configuration
BACKUP_DIR="/backups/school_system"
DB_FILE="/data/student_marks.db"
LOG_FILE="/var/log/school_backup.log"
DATE_STAMP=$(date +%Y%m%d_%H%M%S)
ADMIN_EMAIL="principal@school.edu.in"

# Create backup directory if it doesn't exist
mkdir -p ${BACKUP_DIR}

# Function to log messages
log_message() {
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] $1" >> ${LOG_FILE}
}

# Backup database
log_message "Starting backup..."
cp ${DB_FILE} ${BACKUP_DIR}/marks_${DATE_STAMP}.db

if [ $? -eq 0 ]; then
    log_message "Backup successful"
else
    log_message "Backup FAILED"
    echo "Backup failed on $(date)" | mail -s "ALERT: School System Backup Failed" ${ADMIN_EMAIL}
    exit 1
fi

# Check disk usage
DISK_USAGE=$(df /backups | awk 'NR==2 {print $5}' | sed 's/%//')
if [ ${DISK_USAGE} -gt 90 ]; then
    log_message "WARNING: Disk usage at ${DISK_USAGE}%"
fi

# Generate report
echo "=== School System Report ===" > /tmp/report.txt
echo "Backup Location: ${BACKUP_DIR}" >> /tmp/report.txt
echo "Files Backed Up: $(ls ${BACKUP_DIR} | wc -l)" >> /tmp/report.txt
echo "Disk Usage: ${DISK_USAGE}%" >> /tmp/report.txt
echo "Generated: $(date)" >> /tmp/report.txt

# Send report
mail -s "Daily School System Report" ${ADMIN_EMAIL} < /tmp/report.txt
log_message "Report sent to ${ADMIN_EMAIL}"

echo "Script completed successfully"
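One habit worth adding to scripts like the one above is bash's "strict mode": set -e stops the script at the first failing command, set -u makes unset variables an error, and pipefail makes a failure anywhere in a pipe count. A self-contained demo that writes and runs a tiny strict script:

```shell
#!/bin/bash
# Write a tiny script that uses strict mode, then run it
cat > demo_strict.sh <<'EOF'
#!/bin/bash
set -euo pipefail    # -e: exit on error, -u: unset vars are errors, pipefail: pipes fail loudly
echo "step 1"
false                # this command fails, so -e stops the script here
echo "never reached"
EOF
bash demo_strict.sh > strict_out.txt || echo "script stopped early"
cat strict_out.txt
```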
      

Common Command Cheatsheet

Command    Purpose               Example
pwd        Current directory     pwd
ls         List files            ls -la
cd         Change directory      cd ~/projects
mkdir      Create directory      mkdir -p app/src
cp         Copy files            cp file.txt backup.txt
mv         Move/rename           mv old.txt new.txt
rm         Delete files          rm -r folder/
grep       Search text           grep "error" app.log
cat        Display file          cat readme.txt
find       Find files            find . -name "*.py"
chmod      Change permissions    chmod +x script.sh
ps         View processes        ps aux | grep python

Advanced: System Management and Monitoring

Professional systems administrators use terminal commands to monitor and manage servers:

# Process management
ps aux                      # List all running processes
ps aux | grep python        # Find Python processes
top                         # Real-time system monitor
kill PID                    # Terminate process by ID
kill -9 PID                 # Force kill process

# System resource usage
df -h                       # Disk space usage (human-readable)
du -sh folder_name          # Size of specific folder
free -h                     # RAM usage
uptime                      # System uptime and load average
vmstat                      # Virtual memory statistics

# Real example: monitor whether a Flask web server process is running
if ps aux | grep -q "[f]lask"; then    # the [f] trick stops grep from matching itself
    echo "Flask server is running"
else
    echo "Flask server DOWN - Alerting admin"
    # Send alert notification
fi

# Check disk usage before backups
DISK_USE=$(df / | awk 'NR==2 {print $5}' | sed 's/%//')
if [ $DISK_USE -gt 85 ]; then
    echo "WARNING: Disk usage high at ${DISK_USE}%"
fi
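Two related commands, pgrep and pkill, find or signal processes by name directly, which is safer than parsing ps output by hand. A self-contained demo using a background sleep as a stand-in for a server process:

```shell
#!/bin/bash
sleep 60 &                     # stand-in for a long-running server process
pid=$(pgrep -f "sleep 60")     # pgrep prints the PIDs of matching processes
echo "Found PID: ${pid}"
pkill -f "sleep 60"            # send SIGTERM to every match
wait 2>/dev/null || true       # reap the terminated job quietly
```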
      

File Permissions and Ownership: Secure Your Files

In Unix, every file has permissions. Control who can read, write, and execute:

# Permission symbols: rwx = read, write, execute
ls -l file.txt              # Shows: -rw-r--r-- (permissions, owner, group)

# Change permissions (chmod - change mode)
chmod 644 file.txt          # Owner: rw, Others: r (readable file)
chmod 755 script.sh         # Owner: rwx, Others: rx (executable file)
chmod +x script.sh          # Add execute permission
chmod u-r,u+w file.txt      # Remove read, add write (for the file's owner)

# Numeric permission system:
# 4 = read, 2 = write, 1 = execute
# 7 = 4+2+1 = read+write+execute
# 6 = 4+2 = read+write
# 5 = 4+1 = read+execute

# Change ownership (chown - change owner)
chown student file.txt      # Change owner to student
chown student:group file.txt  # Change owner and group
chown -R student folder/    # Recursively change folder ownership

# Real example: Secure school database
# School admin needs full access, students need read-only
chmod 700 school_admin/     # Admin only
chmod 755 school_data/      # Everyone can read/execute
chmod 644 marks.txt         # Everyone can read
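To check your understanding of the numeric system, here is a small illustrative function (digit_to_rwx is our own helper, not a standard command) that expands one octal digit into its rwx string:

```shell
#!/bin/bash
# Expand one octal permission digit (0-7) into its rwx string
digit_to_rwx() {
    local d=$1 out=""
    (( d & 4 )) && out+="r" || out+="-"   # 4 = read bit
    (( d & 2 )) && out+="w" || out+="-"   # 2 = write bit
    (( d & 1 )) && out+="x" || out+="-"   # 1 = execute bit
    echo "$out"
}
digit_to_rwx 7    # rwx
digit_to_rwx 6    # rw-
digit_to_rwx 5    # r-x
```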
      

Advanced Piping: Complex Data Processing Workflows

Combine multiple commands to process complex data efficiently:

# Example 1: Analyze log files (critical for DevOps)
cat access.log | grep "POST" | awk '{print $1}' | sort | uniq -c | sort -rn
# Result: Shows which IPs made POST requests most frequently

# Example 2: Process CSV student data
cat students.csv | cut -d',' -f2 | sort | uniq | wc -l
# Counts unique student names

# Example 3: Find and display large files
find . -type f -exec ls -lh {} + | sort -k5 -hr | head -10
# Find 10 largest files in directory

# Example 4: Real-world: Process IRCTC ticket data (millions of records)
# Count confirmed tickets by destination
cat irctc_bookings.txt | grep "CONFIRMED" | awk -F',' '{print $4}' | sort | uniq -c | sort -rn | head -20
# Result: Top 20 destinations

# Example 5: Compute the class-wide average from student marks
# (prints "High Performer" once if the overall average of column 2 exceeds 90)
cat student_marks.csv | awk -F',' '{sum+=$2; count++} END {avg=sum/count; if (avg > 90) print "High Performer"}'
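Note that Example 5 computes one overall average, not one per student. To get per-student averages, awk's associative arrays help. A sketch (the sample data and the name,marks column layout are assumed):

```shell
# Per-student averages with awk associative arrays
cat > sample_marks.csv <<'EOF'
Asha,90
Ravi,80
Asha,100
Ravi,70
EOF
awk -F',' '{sum[$1] += $2; n[$1]++}
           END {for (s in sum) printf "%s %.1f\n", s, sum[s]/n[s]}' sample_marks.csv | sort > averages.txt
cat averages.txt
```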
      

Cron Jobs: Schedule Automated Tasks

cron lets you schedule scripts to run automatically. Indian banks, government services, and companies use cron for nightly backups, data processing, and maintenance:

# Edit cron jobs
crontab -e                  # Edit your cron schedule
crontab -l                  # List your cron jobs
crontab -r                  # Remove all cron jobs

# Cron syntax: minute hour day month day-of-week command
# Example: Run backup every day at 2 AM
0 2 * * * /home/admin/scripts/backup.sh

# Run every Monday at 10 AM
0 10 * * 1 /home/admin/scripts/weekly_report.sh

# Run every 30 minutes
*/30 * * * * /home/admin/scripts/health_check.sh

# Real examples for school system:
# Backup student records nightly at 11 PM
# (note: % is special in crontab and must be escaped as \%)
0 23 * * * cp -r /data/students /backups/students_$(date +\%Y\%m\%d)

# Generate attendance report at 8 AM every weekday
0 8 * * 1-5 python /scripts/generate_attendance.py

# Clean up logs older than 30 days
0 3 * * * find /var/log -name "*.log" -mtime +30 -delete

# System resource check - run every hour (append, so a history builds up)
0 * * * * df -h >> /var/log/disk_usage.log
      

SSH and Remote Access: Connect to Servers

Most backend work happens on remote servers. Learn SSH (Secure Shell) to connect:

# Connect to remote server
ssh user@server.example.com
ssh admin@192.168.1.100

# Connect with custom port
ssh -p 2222 user@server.example.com

# Copy files to/from server (scp - secure copy)
scp file.txt user@server.com:/home/user/
# Copy from server to local
scp user@server.com:/home/user/file.txt .

# Recursive copy (directories)
scp -r folder/ user@server.com:/home/user/

# Real example: Deploy school portal to server
# Copy updated code to production server
scp -r /home/dev/school_portal/* admin@school-server.edu.in:/var/www/html/

# SSH key authentication (more secure than passwords)
ssh-keygen -t rsa -b 4096      # Generate SSH key pair
ssh-copy-id user@server.com    # Copy public key to server
ssh user@server.com             # Now login without password!
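Typing full usernames and hostnames gets old quickly. An ~/.ssh/config file stores per-host settings so a short name is enough. A sketch (the host alias is our choice; the server address reuses the example above):

```
# ~/.ssh/config: per-host connection settings, so "ssh school" is enough
Host school
    HostName school-server.edu.in
    User admin
    IdentityFile ~/.ssh/id_rsa
```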
      

Text Processing Tools Mastery

These tools process millions of lines instantly:

# awk - powerful text processing
awk '{print $1}' file.txt              # Print first column
awk -F',' '{print $3}' data.csv        # Print 3rd field from CSV
awk '{sum += $1} END {print sum}' nums.txt  # Sum values

# cut - extract columns
cut -d',' -f1,3 data.csv               # Get columns 1 and 3 from CSV
cut -c1-10 names.txt                   # Get first 10 characters

# sort - sort lines
sort file.txt                          # Alphabetical sort
sort -n numbers.txt                    # Numeric sort
sort -r file.txt                       # Reverse sort
sort -t',' -k2 data.csv                # Sort CSV by column 2

# uniq - remove duplicates (requires sorted input)
sort names.txt | uniq                  # Remove duplicate names
sort names.txt | uniq -c               # Count occurrences

# wc - word/line count
wc file.txt                            # Lines, words, bytes
wc -l file.txt                         # Just line count
wc -w file.txt                         # Just word count

# Real example: Analyze student attendance
# Count absent records
grep "ABSENT" attendance.csv | wc -l

# Get unique subjects from data
cut -d',' -f2 student_marks.csv | sort | uniq

# Sum all marks for a student
grep "STUDENT_ID: 45" marks.csv | awk -F',' '{sum += $3} END {print sum}'
      

Performance Tips: Work Faster at the Terminal

  • Use tab completion - press TAB to auto-complete filenames and commands
  • Use history - press UP arrow to recall previous commands, Ctrl+R to search history
  • Create aliases - alias shortname='long command' to save typing
  • Use wildcards - * matches any characters, ? matches single character
  • Redirect output to files instead of printing - saves time and creates logs
  • Chain multiple pipes for complex operations instead of writing scripts
  • Use background jobs (&) to run multiple commands simultaneously
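A few of these tips in action (aliases normally live in ~/.bashrc; expand_aliases is needed only because this runs as a script):

```shell
#!/bin/bash
shopt -s expand_aliases     # scripts need this for aliases to expand
alias ll='ls -la'           # define once, reuse everywhere
touch notes.txt marks.txt report1.csv
ll notes.txt                # alias in action: runs ls -la notes.txt
ls *.txt                    # * matches any characters: both .txt files
ls report?.csv              # ? matches exactly one character: report1.csv
sleep 1 & sleep 1 &         # & runs jobs in the background
wait                        # pause until both background jobs finish
echo "all jobs done"
```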

Key Takeaways

  • The terminal is faster and more powerful than GUI tools for power users
  • Learn pipes (|) to chain commands—this is Unix philosophy at its best
  • Bash scripts automate repetitive tasks and save hours each week
  • File permissions (chmod) secure sensitive files from unauthorized access
  • Cron jobs automate maintenance tasks like backups and log cleanup
  • SSH enables remote server management—essential for cloud and DevOps roles
  • Text processing tools (awk, grep, cut, sort) handle massive datasets instantly
  • Every DevOps engineer, backend developer, and system admin masters these skills
  • Major Indian IT companies (TCS, Infosys, HCL) use shell scripts in production daily
  • Terminal proficiency can multiply your productivity compared to GUI tools

Practice Problems

  1. Create a directory structure for a project with subdirectories: src/, tests/, docs/, config/
  2. Write a script that counts how many .py files are in your current directory and their total size
  3. Use grep to find all lines containing "error" in a log file, extract error codes, and count unique errors
  4. Create a backup script that copies all .txt files to a backup directory with today's date and logs the operation
  5. Write a script that asks for your name and displays "Hello [Name], Welcome to Terminal Mastery" with timestamp
  6. Find and list all files modified in the last 7 days in your home directory with their sizes
  7. Write a loop that creates 100 empty files named file1.txt, file2.txt, etc., and verify creation
  8. Set up a cron job that runs a Python script every morning at 6 AM to process school attendance data
  9. Create an SSH key pair, copy public key to a server, and login without password
  10. Write a complex pipeline that processes a CSV file with 1 million student records: filter, sort by marks, find top 100, and save to new file

Think About It

Think about this: How would you explain command line mastery to a friend who has never seen a computer? What real-world analogy would you use? Imagine you had to build a system using these concepts — what would be your first step? Try this: before moving on, write down three things you learned and one question you still have.
