In this article, I’ll walk you through creating a lightweight Continuous Integration and Continuous Deployment (CI/CD) pipeline using a Bash script to automate uploads to an AWS S3 bucket. This solution accelerates deployment and feedback cycles without complex tools or costly infrastructure—a practical “poor man’s CI/CD pipeline” for DevOps engineers and developers alike. I’ll also compare this approach with two alternatives to illustrate different ways to achieve the same outcome, highlighting their strengths and trade-offs.

Why Automate AWS S3 Uploads?

With AWS S3’s rise as a scalable, cost-effective storage solution, manually uploading artifacts for testing or deployment can bottleneck development workflows. Automating this process saves time, ensures consistency, and aligns with DevOps principles of efficiency and repeatability. This article presents a Bash-based solution and evaluates it against other options to demonstrate versatile problem-solving.

Approach 1: The Bash Script Solution

How It Works

This script uses the AWS CLI to upload files from two local directories to an S3 bucket. Key features include:

  • Input Flexibility: Accepts one or more file extensions (e.g., .zip, .jar) or the keyword all as arguments.
  • Dual-Directory Support: Scans dir1 and dir2 for matching files.
  • Branch Management: Checks out a specified Git branch (defaults to master).
  • Error Handling: Validates AWS CLI presence and parameter usage.

The Script

#!/bin/bash

# Configuration
BUCKET_NAME="<bucket-name>"
S3_LOCATION="<s3-location>"
DIR1="<directory-path-1>"
DIR2="<directory-path-2>"
DEFAULT_BRANCH="master"

# Verify AWS CLI installation
if ! command -v aws >/dev/null 2>&1; then
    echo "Error: AWS CLI is not installed. Please install it and configure your credentials."
    exit 1
fi

# Function to upload files to S3
upload_to_s3() {
    local dir="$1"
    local extension="$2"
    # Read from process substitution so filenames with spaces are handled safely
    # and so "exit" aborts the whole script rather than a pipeline subshell.
    while IFS= read -r file; do
        filename=$(basename "$file")
        if aws s3 cp "$file" "s3://$BUCKET_NAME/$S3_LOCATION/$filename"; then
            echo "Uploaded: $filename"
        else
            echo "Error: Failed to upload $filename"
            exit 1
        fi
    done < <(find "$dir" -type f -name "*$extension")
}

# Checkout specified branch
BRANCH_NAME="${BRANCH_NAME:-$DEFAULT_BRANCH}"
echo "Switching to branch: $BRANCH_NAME"
git checkout "$BRANCH_NAME" || { echo "Error: Failed to checkout $BRANCH_NAME"; exit 1; }

# Process arguments
if [ $# -eq 0 ]; then
    echo "Error: Please provide at least one file extension (e.g., '.zip') or 'all'."
    echo "Usage: $0 [extension1] [extension2] ... | $0 all"
    exit 1
elif [ "$1" == "all" ] && [ $# -eq 1 ]; then
    echo "Uploading all files from $DIR1 and $DIR2..."
    upload_to_s3 "$DIR1" ""
    upload_to_s3 "$DIR2" ""
else
    for ext in "$@"; do
        if [ "$ext" == "all" ]; then
            echo "Error: 'all' cannot be combined with specific extensions."
            echo "Usage: $0 [extension1] [extension2] ... | $0 all"
            exit 1
        fi
        echo "Uploading files with extension: $ext"
        upload_to_s3 "$DIR1" "$ext"
        upload_to_s3 "$DIR2" "$ext"
    done
fi

echo "Upload process completed."

Customization

Adjust BUCKET_NAME, S3_LOCATION, DIR1, and DIR2 to fit your needs. Set BRANCH_NAME (e.g., BRANCH_NAME=dev ./upload.sh all) for branch-specific uploads.
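
For example, typical invocations might look like this (assuming the script is saved as upload.sh and made executable):

# Upload every file from both directories, using the default branch
./upload.sh all

# Upload only .zip and .jar artifacts
./upload.sh .zip .jar

# Upload all files after switching to the dev branch
BRANCH_NAME=dev ./upload.sh all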

Approach 2: GitHub Actions

How It Works: Define a YAML workflow in your GitHub repository (e.g., .github/workflows/deploy.yml) to upload files to S3 on code pushes.

Example Workflow:

name: Deploy to S3
on: [push]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - name: Upload to S3
        run: aws s3 sync ./artifacts/ s3://<bucket-name>/<s3-location>/

Trade-Offs: Offers automation tied to Git events, but requires a GitHub repository and careful management of the credentials stored as secrets.
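
If you go this route, the AWS keys referenced above must exist as repository secrets before the workflow runs. As a rough sketch, they can be added either in the repository's Settings > Secrets page or with the GitHub CLI, assuming gh is installed and authenticated (values are placeholders):

# Store the AWS credentials as repository secrets
gh secret set AWS_ACCESS_KEY_ID --body "<your-access-key-id>"
gh secret set AWS_SECRET_ACCESS_KEY --body "<your-secret-access-key>"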

Approach 3: Jenkins Pipeline

How It Works: Configure a Jenkins pipeline (e.g., via a Jenkinsfile) to poll a Git repo and upload artifacts to S3.

Example Pipeline:

pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                git branch: 'master', url: '<repo-url>'
            }
        }
        stage('Upload to S3') {
            steps {
                sh 'aws s3 sync ./artifacts/ s3://<bucket-name>/<s3-location>/'
            }
        }
    }
}

Trade-Offs: Provides enterprise-grade features but demands server setup and maintenance, making it overkill for simple use cases.

Comparing Alternatives

While the Bash script is a lightweight, cost-free solution, the other approaches can achieve the same goal. Here's a comparison of the three methods:

  1. Bash Script
     Description: A shell script that uses the AWS CLI to upload files from local directories to S3.
     Pros: Zero cost; highly customizable; runs locally with minimal setup; fast execution.
     Cons: Limited scalability; manual execution; no built-in CI/CD triggers.
     Best for: Solo devs, small teams, quick tests.

  2. GitHub Actions
     Description: A workflow in GitHub Actions that automates S3 uploads on push or pull requests.
     Pros: Free tier available; integrates with Git; scalable with triggers; UI dashboard.
     Cons: Requires a GitHub repo; learning curve for YAML; internet dependency.
     Best for: Teams using GitHub, CI/CD beginners.

  3. Jenkins Pipeline
     Description: A Jenkins job that automates S3 uploads, triggered by SCM polling or webhooks.
     Pros: Robust and scalable; extensive plugin support; on-premises option.
     Cons: Setup complexity; resource-intensive; maintenance overhead.
     Best for: Enterprises, complex workflows.

Key Benefits of the Bash Approach

  1. Time Efficiency: Automates uploads in seconds, ideal for rapid testing.
  2. Cost Savings: No paid tools or cloud instances—just Bash and AWS CLI.
  3. Simplicity: Minimal setup, perfect for quick wins or resource-constrained environments.

Compared to GitHub Actions and Jenkins, it lacks native CI/CD triggers but excels in simplicity and cost.
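
If you want a rough stand-in for those triggers without adopting a CI/CD platform, the script can simply be run on a schedule. A minimal sketch using cron (added with crontab -e), assuming the repository and script live at a hypothetical /home/user/project, might look like this:

# Run every 15 minutes, uploading all artifacts from the master branch
*/15 * * * * cd /home/user/project && BRANCH_NAME=master ./upload.sh all >> /tmp/s3-upload.log 2>&1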

Conclusion

The Bash script offers a fast, free, and flexible way to automate S3 uploads, serving as a “poor man’s CI/CD pipeline” for small-scale needs. While GitHub Actions and Jenkins provide more robust automation, the script’s simplicity makes it a compelling choice for quick deployments or learning projects. Whatever your context, understanding these options equips you to tailor solutions to real-world challenges—a skill any engineering team would value.
