Shell Scripting: From Basics to Advanced DevOps Techniques

Table of Contents

  1. Introduction

  2. Setting Up Your Environment

  3. Shell Scripting Fundamentals

  4. Advanced Shell Scripting Techniques

  5. DevOps-Specific Applications

  6. Best Practices and Optimization

  7. Troubleshooting and Debugging

  8. Real-World Case Studies

  9. Conclusion and Further Resources

Introduction

Welcome to the world of shell scripting! In today's fast-paced DevOps environment, the ability to automate tasks, manage systems efficiently, and streamline workflows is more crucial than ever. Shell scripting is the secret weapon that empowers DevOps engineers to achieve these goals with finesse and precision.

This comprehensive guide will take you on a journey from shell scripting basics to advanced DevOps techniques. Whether you're a beginner looking to get started or an experienced professional aiming to refine your skills, this blog post has something for everyone.

Why Shell Scripting Matters in DevOps

  1. Automation: Eliminate repetitive tasks and reduce human error.

  2. Efficiency: Perform complex operations quickly and consistently.

  3. Flexibility: Adapt to various environments and requirements with ease.

  4. Integration: Seamlessly connect different tools and systems in your DevOps pipeline.

  5. Troubleshooting: Quickly diagnose and resolve issues in production environments.

Let's dive in and unlock the full potential of shell scripting in your DevOps journey!

Setting Up Your Environment

Before we start coding, let's set up a proper environment for our shell scripting adventures.

Lab 1: Creating a Linux Machine on AWS

  1. Sign in to AWS Console

  2. Launch an EC2 Instance:

    • Choose Amazon Linux 2 AMI

    • Select the t2.micro instance type (or another type that suits your needs)

    • Configure security group to allow SSH (port 22)

    • Create or use an existing key pair

  3. Connect to Your Instance:

     ssh -i "your-key-pair.pem" ec2-user@your-instance-public-ip
    

Lab 2: Setting Up Your Development Environment

Once connected to your EC2 instance, let's set up a comfortable development environment:

  1. Update your system:

     sudo yum update -y
    
  2. Install useful tools:

     sudo yum install -y git vim tmux
    
  3. Configure your editor (e.g., Vim):

     echo "syntax on
     set number
     set tabstop=4
     set shiftwidth=4
     set expandtab" > ~/.vimrc
    
  4. Create a directory for your scripts:

     mkdir ~/shellscripts
     cd ~/shellscripts
    

Now you're ready to start scripting!

Shell Scripting Fundamentals

Let's begin with the basics and gradually build our skills.

Your First Shell Script

Lab 3: Hello, World!

  1. Create a script file:

     touch hello_world.sh
    
  2. Open the file in Vim:

     vim hello_world.sh
    
  3. Add the following content:

     #!/bin/bash
     echo "Hello, World! Welcome to Shell Scripting."
    
  4. Save and exit (press Esc, type :wq, and hit Enter)

  5. Make the script executable:

     chmod +x hello_world.sh
    
  6. Run the script:

     ./hello_world.sh
    

Congratulations! You've just created and run your first shell script.

Variables and Data Types

Variables are fundamental to any programming language, including shell scripting.

Lab 4: Working with Variables

Create a new script variables.sh:

#!/bin/bash

# String variable
NAME="Alice"

# Integer variable
AGE=30

# Array variable (bash arrays are zero-indexed)
FRUITS=("apple" "banana" "cherry")

echo "My name is $NAME and I am $AGE years old."
echo "My favorite fruit is ${FRUITS[1]}."

# Command substitution
CURRENT_DATE=$(date +%Y-%m-%d)
echo "Today's date is $CURRENT_DATE"

# Arithmetic operations
FUTURE_AGE=$((AGE + 5))
echo "In 5 years, I will be $FUTURE_AGE years old."

Run the script and observe the output. Try modifying the variables and see how it affects the results.
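Beyond simple assignment, bash's parameter expansion can measure and slice strings and supply defaults. The following sketch (variable names are illustrative) shows a few forms you'll reach for constantly:

```shell
#!/bin/bash
# Parameter-expansion basics; all names here are illustrative.
NAME="Alice"
echo "Length of NAME: ${#NAME}"          # string length
echo "Substring: ${NAME:1:3}"            # offset 1, length 3

# Fall back to a default when a variable is unset or empty
GREETING=${UNSET_GREETING:-"Hello"}
echo "$GREETING"

FRUITS=("apple" "banana" "cherry")
echo "Number of fruits: ${#FRUITS[@]}"   # element count of the array
```

These expansions happen entirely inside the shell, so they are much faster than spawning external tools like `cut` or `expr` for the same job.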

Control Flow Statements

Control flow statements allow you to make decisions and control the execution of your scripts.

Lab 5: If-Else and Case Statements

Create a script control_flow.sh:

#!/bin/bash

echo "Enter a number between 1 and 3:"
read NUM

# If-Else statement
if [ "$NUM" -eq 1 ]; then
    echo "You entered one."
elif [ "$NUM" -eq 2 ]; then
    echo "You entered two."
elif [ "$NUM" -eq 3 ]; then
    echo "You entered three."
else
    echo "Invalid input."
fi

# Case statement
echo "Enter a fruit name (apple, banana, or cherry):"
read FRUIT

case $FRUIT in
    "apple")
        echo "Apples are red."
        ;;
    "banana")
        echo "Bananas are yellow."
        ;;
    "cherry")
        echo "Cherries are red."
        ;;
    *)
        echo "Unknown fruit."
        ;;
esac

Run the script with different inputs to see how it behaves.
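One caveat with the lab above: `read` accepts anything, and `[ "$NUM" -eq 1 ]` prints an error on non-numeric input. Validating input first with bash's `[[ ... =~ ... ]]` test avoids that; the helper function below is an illustrative sketch, not part of the lab script:

```shell
#!/bin/bash
# Validate that input is a number from 1 to 3 before comparing it.
is_valid_choice() {
    [[ $1 =~ ^[1-3]$ ]]   # exit status is 0 when the pattern matches
}

if is_valid_choice "2"; then
    echo "2 is a valid choice"
fi
if ! is_valid_choice "abc"; then
    echo "abc is rejected"
fi
```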

Loops

Loops allow you to execute a block of code repeatedly.

Lab 6: For and While Loops

Create a script loops.sh:

#!/bin/bash

# For loop
echo "Counting from 1 to 5:"
for i in {1..5}
do
    echo $i
done

# While loop
echo "Countdown from 5 to 1:"
count=5
while [ $count -gt 0 ]
do
    echo $count
    count=$((count - 1))
    sleep 1
done
echo "Blast off!"

# Loop through an array
COLORS=("red" "green" "blue" "yellow")
for color in "${COLORS[@]}"
do
    echo "Color: $color"
done

Run the script and observe how different types of loops work.
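Besides the brace-expansion `for` and the `while` loop, bash also supports a C-style `for` and an `until` loop. A quick sketch:

```shell
#!/bin/bash
# C-style for loop: explicit initializer, condition, and increment
SUM=0
for ((i = 1; i <= 5; i++)); do
    SUM=$((SUM + i))
done
echo "Sum of 1..5 is $SUM"

# until loop: runs as long as the condition is FALSE
n=3
until [ "$n" -le 0 ]; do
    n=$((n - 1))
done
echo "Countdown finished at $n"
```

The C-style form is handy when the loop bound is a variable, since `{1..$N}` does not expand the way you might expect.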

Functions

Functions help you organize and reuse code.

Lab 7: Creating and Using Functions

Create a script functions.sh:

#!/bin/bash

# Simple function
greet() {
    echo "Hello, $1! Nice to meet you."
}

# Function with return value
calculate_sum() {
    local num1=$1
    local num2=$2
    local sum=$((num1 + num2))
    echo $sum
}

# Main script
echo "Enter your name:"
read NAME
greet "$NAME"  # quoted so names containing spaces stay a single argument

echo "Enter two numbers to add:"
read NUM1
read NUM2
RESULT=$(calculate_sum $NUM1 $NUM2)
echo "The sum of $NUM1 and $NUM2 is $RESULT"

Run the script and experiment with different inputs.
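A point worth making explicit: a function communicates two separate things, an exit status (0-255, for success or failure) and stdout (for data, captured with `$(...)`). Confusing the two is a common bug. A minimal sketch with illustrative function names:

```shell
#!/bin/bash
# Exit status vs stdout in functions.
is_even() {
    (( $1 % 2 == 0 ))    # the arithmetic test sets the exit status
}

double() {
    echo $(( $1 * 2 ))   # data goes to stdout
}

if is_even 4; then
    echo "4 is even"
fi
echo "Twice 21 is $(double 21)"
```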

Advanced Shell Scripting Techniques

Now that we've covered the basics, let's dive into more advanced techniques that will take your shell scripting skills to the next level.

Command Line Arguments

Command line arguments allow you to pass information to your script when you run it.

Lab 8: Working with Command Line Arguments

Create a script cli_args.sh:

#!/bin/bash

# Check if at least one argument is provided
if [ $# -eq 0 ]; then
    echo "Usage: $0 <name> [age]"
    exit 1
fi

# Access command line arguments
NAME=$1
AGE=${2:-"Unknown"}  # Use "Unknown" if age is not provided

echo "Name: $NAME"
echo "Age: $AGE"

# Loop through all arguments
echo "All arguments:"
for arg in "$@"
do
    echo "$arg"
done

# Total number of arguments
echo "Total arguments: $#"

Run the script with different arguments:

./cli_args.sh Alice 30
./cli_args.sh Bob
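Positional arguments get unwieldy as options multiply. For flag-style options, bash's built-in `getopts` is more robust; the `parse_args` helper below is an illustrative sketch, not part of the lab script:

```shell
#!/bin/bash
# Parse -n <name> and -a <age> flags with the getopts built-in.
parse_args() {
    NAME=""
    AGE="Unknown"
    local OPTIND=1 opt   # reset OPTIND so the function can be called repeatedly
    while getopts "n:a:" opt; do
        case $opt in
            n) NAME=$OPTARG ;;
            a) AGE=$OPTARG ;;
            *) echo "Usage: $0 -n <name> [-a <age>]" >&2; return 1 ;;
        esac
    done
}

parse_args -n Alice -a 30
echo "Name: $NAME, Age: $AGE"
```

With this approach the options can appear in any order, and missing optional flags simply keep their defaults.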

Input/Output Redirection and Pipes

I/O redirection and pipes are powerful features that allow you to manipulate data flow in your scripts.

Lab 9: I/O Redirection and Pipes

Create a script io_redirection.sh:

#!/bin/bash

# Write to a file
echo "This is a sample file." > sample.txt
echo "Adding another line." >> sample.txt

# Read from a file
echo "Contents of sample.txt:"
cat sample.txt

# Use pipes to filter output
echo "Lines containing 'line':"
cat sample.txt | grep "line"

# Combine multiple commands
echo "Sorted unique words:"
cat sample.txt | tr ' ' '\n' | sort | uniq

# Redirect errors to a file
ls non_existent_file 2> errors.log

echo "Errors logged to errors.log"

Run the script and examine the output and created files.
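Two more redirection forms deserve a mention: here-documents (`<<`) feed a multi-line block to a command's stdin, and here-strings (`<<<`) feed a single string. A short sketch:

```shell
#!/bin/bash
# Here-document: everything up to the EOF marker becomes stdin.
# Quoting the marker ('EOF') disables variable expansion inside the block.
TEXT=$(cat << 'EOF'
first line
second line
EOF
)
echo "$TEXT"

# Here-string: pass one string as stdin without echo and a pipe.
WORD_COUNT=$(wc -w <<< "one two three")
echo "word count: $WORD_COUNT"
```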

Regular Expressions

Regular expressions (regex) are powerful tools for pattern matching and text manipulation.

Lab 10: Using Regular Expressions

Create a script regex_example.sh:

#!/bin/bash

# Sample text
TEXT="The quick brown fox jumps over the lazy dog. The dog is very lazy."

# Basic pattern matching
echo "$TEXT" | grep "fox"

# Case-insensitive matching
echo "$TEXT" | grep -i "DOG"

# Word boundary matching
echo "$TEXT" | grep -w "is"

# Extended regular expressions
echo "$TEXT" | grep -E "quick|lazy"

# Capture groups
echo "$TEXT" | sed -E 's/(fox|dog)/\U\1/g'

# Validate email format
validate_email() {
    local email=$1
    if [[ $email =~ ^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$ ]]; then
        echo "Valid email format"
    else
        echo "Invalid email format"
    fi
}

validate_email "user@example.com"
validate_email "invalid.email@"

Run the script and observe how regular expressions can be used for various text processing tasks.
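When a `[[ ... =~ ... ]]` match succeeds, bash stores the capture groups in the `BASH_REMATCH` array: index 0 holds the full match, indices 1..n the groups. This is handy for pulling fields out of a line without `sed` or `awk`; the sample line below is illustrative:

```shell
#!/bin/bash
# Extract the log level and date from a line using capture groups.
LOG_LINE="ERROR 2023-04-10 disk full"
if [[ $LOG_LINE =~ ^([A-Z]+)\ ([0-9]{4}-[0-9]{2}-[0-9]{2}) ]]; then
    LEVEL=${BASH_REMATCH[1]}
    EVENT_DATE=${BASH_REMATCH[2]}
    echo "level=$LEVEL date=$EVENT_DATE"
fi
```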

DevOps-Specific Applications

Now, let's explore how shell scripting can be applied to common DevOps tasks.

Automating System Updates

Lab 11: System Update Script

Create a script update_system.sh:

#!/bin/bash

LOG_FILE="/var/log/system_update.log"

# Function to log messages
log_message() {
    echo "$(date '+%Y-%m-%d %H:%M:%S') - $1" >> "$LOG_FILE"
}

# Update installed packages ("yum update" refreshes metadata and applies
# upgrades in one step, so a separate upgrade pass isn't needed)
log_message "Updating installed packages..."
sudo yum update -y >> "$LOG_FILE" 2>&1

# Clean up
log_message "Cleaning up..."
sudo yum clean all >> "$LOG_FILE" 2>&1

log_message "System update complete!"

echo "System update finished. Check $LOG_FILE for details."

Make the script executable and run it to update your EC2 instance.
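To run this unattended, the usual approach is a cron entry (edit the root crontab with `sudo crontab -e`). The schedule and path below are illustrative assumptions:

```
# minute hour day-of-month month day-of-week  command
# Run the update script every Sunday at 03:00:
0 3 * * 0 /home/ec2-user/shellscripts/update_system.sh
```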

Monitoring System Resources

Lab 12: Resource Monitoring Script

Create a script monitor_resources.sh:

#!/bin/bash

# Function to get CPU usage
get_cpu_usage() {
    top -bn1 | grep "Cpu(s)" | sed "s/.*, *\([0-9.]*\)%* id.*/\1/" | awk '{print 100 - $1"%"}'
}

# Function to get memory usage
get_memory_usage() {
    free | grep Mem | awk '{print $3/$2 * 100.0"%"}'
}

# Function to get disk usage
get_disk_usage() {
    df -h | awk '$NF=="/"{printf "%s", $5}'
}

# Main monitoring loop
while true
do
    clear
    echo "System Resource Monitor"
    echo "----------------------"
    echo "CPU Usage: $(get_cpu_usage)"
    echo "Memory Usage: $(get_memory_usage)"
    echo "Disk Usage: $(get_disk_usage)"
    sleep 5
done

Run this script to get a real-time view of your system's resource usage.
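Displaying numbers is only half the job; monitoring usually means acting when a value crosses a threshold. The `check_threshold` helper below is an illustrative sketch (not part of the script above) that prints a warning and returns a non-zero status when usage exceeds a limit:

```shell
#!/bin/bash
# Compare a usage percentage against a threshold and flag breaches.
check_threshold() {
    local label=$1 value=$2 limit=$3
    if [ "$value" -gt "$limit" ]; then
        echo "WARNING: $label at ${value}% (limit ${limit}%)"
        return 1
    fi
    echo "OK: $label at ${value}%"
}

check_threshold "memory" 40 80
check_threshold "disk" 92 80 || true   # a non-zero status could trigger mail or paging
```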

Log Analysis

Lab 13: Apache Log Analyzer

First, let's create a sample Apache log file for demonstration purposes:

cat << EOF > sample_apache.log
192.168.1.100 - - [10/Apr/2023:13:55:36 -0700] "GET /index.html HTTP/1.1" 200 2326
192.168.1.101 - - [10/Apr/2023:13:55:36 -0700] "GET /css/style.css HTTP/1.1" 200 1234
192.168.1.100 - - [10/Apr/2023:13:55:37 -0700] "GET /js/script.js HTTP/1.1" 200 4321
192.168.1.102 - - [10/Apr/2023:13:55:38 -0700] "POST /login HTTP/1.1" 302 -
192.168.1.103 - - [10/Apr/2023:13:55:39 -0700] "GET /admin HTTP/1.1" 403 1234
192.168.1.100 - - [10/Apr/2023:13:55:40 -0700] "GET /images/logo.png HTTP/1.1" 200 5678
192.168.1.104 - - [10/Apr/2023:13:55:41 -0700] "GET /non-existent-page HTTP/1.1" 404 897
EOF

Alternatively, the script below generates a larger, randomized sample Apache log with configurable parameters. A breakdown of its features:

  1. Configurable output: You can set the output file name and the number of log entries to generate.

  2. Date range: You can specify a start and end date for the log entries.

  3. Realistic data: The script uses arrays of realistic values for IP addresses, HTTP methods, pages, status codes, and user agents.

  4. Random generation: Each log entry is generated with random values from these arrays, creating a diverse and realistic log file.

  5. Customizable: You can easily modify the arrays to include different values or add more variety.

To use this script:

  1. Save it as apache_log_generator.sh

  2. Make it executable: chmod +x apache_log_generator.sh

  3. Run it: ./apache_log_generator.sh

After running this script, you'll have a sample_apache.log file that you can use with the Apache log analyzer script.

#!/bin/bash

# Configuration
OUTPUT_FILE="sample_apache.log"
NUM_ENTRIES=1000
START_DATE="2023-04-10"
END_DATE="2023-04-11"

# Arrays for random data generation
IP_ADDRESSES=("192.168.1.100" "192.168.1.101" "192.168.1.102" "192.168.1.103" "192.168.1.104")
HTTP_METHODS=("GET" "POST" "PUT" "DELETE")
PAGES=("/index.html" "/about.html" "/contact.html" "/products.html" "/services.html" "/login" "/logout" "/api/v1/users" "/images/logo.png" "/css/style.css" "/js/script.js")
HTTP_VERSIONS=("HTTP/1.0" "HTTP/1.1" "HTTP/2.0")
STATUS_CODES=("200" "201" "204" "301" "302" "304" "400" "401" "403" "404" "500" "502" "503")
USER_AGENTS=(
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36"
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.1.1 Safari/605.1.15"
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:89.0) Gecko/20100101 Firefox/89.0"
    "Mozilla/5.0 (iPhone; CPU iPhone OS 14_6 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.0 Mobile/15E148 Safari/604.1"
    "Mozilla/5.0 (Linux; Android 11; SM-G991B) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.120 Mobile Safari/537.36"
)

# Function to generate a random date between START_DATE and END_DATE
random_date() {
    start_epoch=$(date -d "$START_DATE" +%s)
    end_epoch=$(date -d "$END_DATE" +%s)
    # $RANDOM is only 15 bits (0-32767), so combine two draws to span a full day
    random_epoch=$((start_epoch + (RANDOM * 32768 + RANDOM) % (end_epoch - start_epoch)))
    date -d @$random_epoch "+%d/%b/%Y:%H:%M:%S %z"
}

# Function to get a random element from an array
random_element() {
    local array=("$@")
    echo "${array[$RANDOM % ${#array[@]}]}"
}

# Function to generate a random number within a range
random_number() {
    local min=$1
    local max=$2
    echo $((RANDOM % (max - min + 1) + min))
}

# Generate log entries
for ((i=1; i<=NUM_ENTRIES; i++)); do
    ip=$(random_element "${IP_ADDRESSES[@]}")
    date=$(random_date)
    method=$(random_element "${HTTP_METHODS[@]}")
    page=$(random_element "${PAGES[@]}")
    http_version=$(random_element "${HTTP_VERSIONS[@]}")
    status=$(random_element "${STATUS_CODES[@]}")
    bytes=$(random_number 100 5000)
    user_agent=$(random_element "${USER_AGENTS[@]}")

    echo "$ip - - [$date] \"$method $page $http_version\" $status $bytes \"-\" \"$user_agent\"" >> "$OUTPUT_FILE"
done

echo "Generated $NUM_ENTRIES log entries in $OUTPUT_FILE"

Now, create a script apache_log_analyzer.sh:

#!/bin/bash

# Configuration
LOG_FILE="sample_apache.log"
OUTPUT_FILE="apache_log_analysis_$(date +%Y%m%d_%H%M%S).txt"

# Function to print section headers
print_header() {
    echo -e "\n=== $1 ===" | tee -a "$OUTPUT_FILE"
}

# Start analysis
echo "Apache Log Analysis" | tee "$OUTPUT_FILE"
echo "$(date)" | tee -a "$OUTPUT_FILE"
echo "------------------" | tee -a "$OUTPUT_FILE"

# Total number of requests
print_header "Total Requests"
total_requests=$(wc -l < "$LOG_FILE")
echo "Total requests: $total_requests" | tee -a "$OUTPUT_FILE"

# Most common HTTP status codes
print_header "Most Common HTTP Status Codes"
awk '{print $9}' "$LOG_FILE" | sort | uniq -c | sort -rn | head -5 | \
    awk '{printf "%-5s : %s (%.2f%%)\n", $2, $1, $1/'"$total_requests"'*100}' | tee -a "$OUTPUT_FILE"

# Top 5 IP addresses
print_header "Top 5 IP Addresses"
awk '{print $1}' "$LOG_FILE" | sort | uniq -c | sort -rn | head -5 | \
    awk '{printf "%-15s : %s (%.2f%%)\n", $2, $1, $1/'"$total_requests"'*100}' | tee -a "$OUTPUT_FILE"

# Most requested pages
print_header "Most Requested Pages"
awk '{print $7}' "$LOG_FILE" | sort | uniq -c | sort -rn | head -5 | \
    awk '{printf "%-30s : %s (%.2f%%)\n", $2, $1, $1/'"$total_requests"'*100}' | tee -a "$OUTPUT_FILE"

# Requests per hour
print_header "Requests per Hour"
awk '{print $4}' "$LOG_FILE" | cut -d: -f2 | sort | uniq -c | sort -n | \
    awk '{printf "%02d:00 - %02d:59 : %s\n", $2, $2, $1}' | tee -a "$OUTPUT_FILE"

# HTTP methods distribution
print_header "HTTP Methods Distribution"
awk '{print $6}' "$LOG_FILE" | tr -d '"' | sort | uniq -c | sort -rn | \
    awk '{printf "%-6s : %s (%.2f%%)\n", $2, $1, $1/'"$total_requests"'*100}' | tee -a "$OUTPUT_FILE"

# Percentage of 4xx (client error) responses
print_header "Client Error (4xx) Responses"
errors=$(awk '$9 ~ /^4/ {count++} END {print count+0}' "$LOG_FILE")  # +0 prints 0 instead of an empty string when nothing matches
percentage=$(awk "BEGIN {printf \"%.2f\", $errors / $total_requests * 100}")
echo "Number of 4xx responses: $errors" | tee -a "$OUTPUT_FILE"
echo "Percentage of 4xx responses: $percentage%" | tee -a "$OUTPUT_FILE"

# Percentage of 5xx (server error) responses
print_header "Server Error (5xx) Responses"
server_errors=$(awk '$9 ~ /^5/ {count++} END {print count+0}' "$LOG_FILE")
server_error_percentage=$(awk "BEGIN {printf \"%.2f\", $server_errors / $total_requests * 100}")
echo "Number of 5xx responses: $server_errors" | tee -a "$OUTPUT_FILE"
echo "Percentage of 5xx responses: $server_error_percentage%" | tee -a "$OUTPUT_FILE"

# Average response size
print_header "Average Response Size"
avg_size=$(awk '$10 ~ /^[0-9]+$/ {total += $10; count++} END {printf "%.2f", total/count}' "$LOG_FILE")
echo "Average response size: $avg_size bytes" | tee -a "$OUTPUT_FILE"

# Busiest day
print_header "Busiest Day"
busiest_day=$(awk '{print $4}' "$LOG_FILE" | cut -d: -f1 | tr -d '[' | sort | uniq -c | sort -rn | head -1 | awk '{print $2 " (" $1 " requests)"}')
echo "Busiest day: $busiest_day" | tee -a "$OUTPUT_FILE"

# Top user agents
print_header "Top User Agents"
awk -F'"' '{print $6}' "$LOG_FILE" | sort | uniq -c | sort -rn | head -5 | \
    awk '{count=$1; sub(/^ *[0-9]+ /, ""); printf "%-50s : %s\n", substr($0, 1, 50), count}' | tee -a "$OUTPUT_FILE"

echo -e "\nAnalysis complete. Full report saved to $OUTPUT_FILE"

Finally, here are more advanced shell scripts that are particularly useful in DevOps environments, covering system administration, deployment, monitoring, and automation. A brief explanation of each:

  1. Automated Backup Script with Rotation: Creates backups of a specified directory and maintains a set number of recent backups, deleting older ones.

     #########################
     # 1. Automated Backup Script with Rotation
     #########################
    
     #!/bin/bash
    
     BACKUP_DIR="/path/to/backups"
     SOURCE_DIR="/path/to/source"
     MAX_BACKUPS=7
    
     # Create backup filename with timestamp
     BACKUP_FILENAME="backup_$(date +%Y%m%d_%H%M%S).tar.gz"
    
     # Create backup
     tar -czf "$BACKUP_DIR/$BACKUP_FILENAME" -C "$SOURCE_DIR" .
    
     # Rotate old backups
     while [ $(ls -1 "$BACKUP_DIR" | wc -l) -gt $MAX_BACKUPS ]
     do
         OLDEST_BACKUP=$(ls -1t "$BACKUP_DIR" | tail -1)
         rm "$BACKUP_DIR/$OLDEST_BACKUP"
         echo "Removed old backup: $OLDEST_BACKUP"
     done
    
     echo "Backup completed: $BACKUP_FILENAME"
    
  2. Docker Container Health Check and Auto-Restart: Monitors a Docker container and automatically restarts it if it stops running, with a maximum number of restart attempts.

     #########################
     # 2. Docker Container Health Check and Auto-Restart
     #########################
    
     #!/bin/bash
    
     CONTAINER_NAME="my-app-container"
     MAX_RESTARTS=3
     RESTART_DELAY=60
    
     check_container() {
         docker inspect -f '{{.State.Running}}' "$CONTAINER_NAME" 2>/dev/null
     }
    
     restart_count=0
    
     while true; do
         if [ "$(check_container)" != "true" ]; then
             echo "Container $CONTAINER_NAME is not running. Attempting to restart..."
             docker start "$CONTAINER_NAME"
             restart_count=$((restart_count + 1))
    
             if [ $restart_count -ge $MAX_RESTARTS ]; then
                 echo "Max restart attempts reached. Please check the container manually."
                 exit 1
             fi
         else
             restart_count=0
         fi
    
         sleep $RESTART_DELAY
     done
    
  3. Multi-server Configuration Sync: Synchronizes a configuration file across multiple servers and restarts the relevant service on each server.

     #########################
     # 3. Multi-server Configuration Sync
     #########################
    
     #!/bin/bash
    
     CONFIG_FILE="/path/to/config.yml"
     SERVERS=("server1.example.com" "server2.example.com" "server3.example.com")
     REMOTE_PATH="/etc/myapp/config.yml"
    
     for server in "${SERVERS[@]}"; do
         echo "Syncing configuration to $server..."
         scp "$CONFIG_FILE" "user@$server:$REMOTE_PATH"
    
         ssh "user@$server" << EOF
             sudo systemctl restart myapp.service
             echo "Configuration updated and service restarted on $server"
     EOF
     done
    
     echo "Configuration sync completed for all servers."
    
  4. Advanced Log Analysis with Alert: Analyzes a log file for errors and sends an email alert if the number of errors exceeds a threshold.

     #########################
     # 4. Advanced Log Analysis with Alert
     #########################
    
     #!/bin/bash
    
     LOG_FILE="/var/log/myapp.log"
     ERROR_THRESHOLD=10
     ALERT_EMAIL="admin@example.com"
    
     error_count=$(grep -c "ERROR" "$LOG_FILE")
    
     if [ $error_count -ge $ERROR_THRESHOLD ]; then
         subject="ALERT: High number of errors in application log"
         body="There are $error_count errors in the log file $LOG_FILE. Please investigate."
    
         echo "$body" | mail -s "$subject" "$ALERT_EMAIL"
    
         # Additionally, you could trigger other actions here, like restarting the service
         # sudo systemctl restart myapp.service
     fi
    
  5. Automated Database Backup and Cloud Upload: Performs a MySQL database backup, compresses it, uploads it to an S3 bucket, and cleans up the local copy.

     #########################
     # 5. Automated Database Backup and Cloud Upload
     #########################
    
     #!/bin/bash
    
     DB_NAME="myapp_db"
     DB_USER="dbuser"
     DB_PASS="dbpassword"
     BACKUP_DIR="/path/to/db_backups"
     S3_BUCKET="s3://my-backup-bucket"
    
     # Ensure backup directory exists
     mkdir -p "$BACKUP_DIR"
    
     # Create backup filename with timestamp
     BACKUP_FILE="$BACKUP_DIR/${DB_NAME}_$(date +%Y%m%d_%H%M%S).sql.gz"
    
     # Perform database backup
     mysqldump -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" | gzip > "$BACKUP_FILE"
    
     # Upload to S3
     aws s3 cp "$BACKUP_FILE" "$S3_BUCKET/"
    
     # Clean up local backup
     rm "$BACKUP_FILE"
    
     echo "Database backup completed and uploaded to S3: ${BACKUP_FILE##*/}"
    
  6. Kubernetes Pod Restart Script: Safely restarts all pods in a Kubernetes deployment by scaling down to zero and then back up to the original replica count.

     #########################
     # 6. Kubernetes Pod Restart Script
     #########################
    
     #!/bin/bash
    
     NAMESPACE="myapp-namespace"
     DEPLOYMENT="myapp-deployment"
    
     # Get current replica count
     REPLICAS=$(kubectl get deployment "$DEPLOYMENT" -n "$NAMESPACE" -o=jsonpath='{.spec.replicas}')
    
     echo "Current replicas: $REPLICAS"
    
     # Scale down to 0
     kubectl scale deployment "$DEPLOYMENT" -n "$NAMESPACE" --replicas=0
     echo "Scaled down to 0 replicas"
    
     # Wait for pods to terminate
     kubectl wait --for=delete pod -l app="$DEPLOYMENT" -n "$NAMESPACE" --timeout=60s
    
     # Scale back up to original count
     kubectl scale deployment "$DEPLOYMENT" -n "$NAMESPACE" --replicas="$REPLICAS"
     echo "Scaling back up to $REPLICAS replicas"
    
     # Wait for pods to be ready
     kubectl wait --for=condition=ready pod -l app="$DEPLOYMENT" -n "$NAMESPACE" --timeout=120s
    
     echo "Deployment $DEPLOYMENT restarted successfully"
    
  7. SSL Certificate Expiry Checker: Checks the expiry dates of SSL certificates for multiple domains and sends an alert if any are close to expiring.

     #########################
     # 7. SSL Certificate Expiry Checker
     #########################
    
     #!/bin/bash
    
     DOMAINS=("example.com" "api.example.com" "blog.example.com")
     DAYS_THRESHOLD=30
     ALERT_EMAIL="admin@example.com"
    
     for domain in "${DOMAINS[@]}"; do
         expiry_date=$(echo | openssl s_client -servername "$domain" -connect "$domain":443 2>/dev/null | openssl x509 -noout -enddate | cut -d= -f2)
         expiry_epoch=$(date -d "$expiry_date" +%s)
         current_epoch=$(date +%s)
         days_left=$(( (expiry_epoch - current_epoch) / 86400 ))
    
         if [ $days_left -le $DAYS_THRESHOLD ]; then
             echo "WARNING: SSL certificate for $domain will expire in $days_left days" | mail -s "SSL Certificate Expiry Alert" "$ALERT_EMAIL"
         fi
     done
    
  8. Automated Security Updates: Automatically installs security updates on a Linux system and schedules a reboot if necessary.

     #########################
     # 8. Automated Security Updates
     #########################
    
     #!/bin/bash
    
     LOG_FILE="/var/log/security_updates.log"
    
     echo "Starting security updates at $(date)" >> "$LOG_FILE"
    
     # Update package lists
     apt-get update >> "$LOG_FILE" 2>&1
    
     # Install security updates (dry-run the upgrade, then install only
     # the packages coming from security sources)
     apt-get -s dist-upgrade | grep "^Inst" | grep -i security | awk '{print $2}' | xargs apt-get install -y >> "$LOG_FILE" 2>&1
    
     echo "Security updates completed at $(date)" >> "$LOG_FILE"
    
     # Check if reboot is required
     if [ -f /var/run/reboot-required ]; then
         echo "Reboot is required. Scheduling reboot for midnight." >> "$LOG_FILE"
         echo "shutdown -r 00:00" | at now
     fi
    
     echo "Update process completed."
    

These scripts demonstrate advanced techniques such as:

  • Working with external services (Docker, Kubernetes, AWS S3)

  • System monitoring and automated responses

  • Multi-server management

  • Security and compliance checks

  • Automated maintenance tasks

To use these scripts:

  1. Extract each script into its own file.

  2. Modify the variables and paths to match your environment.

  3. Make the scripts executable with chmod +x script_name.sh.

  4. Run them manually or set them up as cron jobs for regular execution.

Remember to test these scripts in a non-production environment before using them in production, and always ensure you have proper backups and failsafes in place.
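When adapting these scripts for production, a defensive "strict mode" header is a widely used convention (not something the scripts above rely on) that makes failures loud instead of silent:

```shell
#!/bin/bash
# "Strict mode" header:
#   -e           exit immediately when a command fails
#   -u           treat unset variables as errors
#   -o pipefail  a pipeline fails if any stage fails, not just the last
set -euo pipefail

# Demonstrate -u: expanding an unset variable aborts the subshell,
# which we detect here instead of silently substituting an empty string.
if ! (echo "$UNDEFINED_VAR") 2>/dev/null; then
    echo "caught an unset variable"
fi
```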