Building a CI/CD Pipeline Using AWS CodeBuild, CodePipeline & CodeDeploy

Titus James

When I started building and deploying containerized applications, I quickly realized that manually pushing Docker images and SSH-ing into servers wasn’t going to scale. That’s when I decided to set up a CI/CD pipeline using AWS tools — and honestly, it felt a little overwhelming at first.

In this blog, I’ll walk you through how I built a pipeline that:

Picks up code from GitHub
Builds a Docker image and pushes it to Docker Hub
Automatically deploys the app to an EC2 instance using CodeDeploy


What We’re Building

Here's the big picture:

GitHub → CodePipeline → CodeBuild (Docker build + push) → CodeDeploy → EC2
  • CodePipeline automates the workflow

  • CodeBuild builds and pushes the Docker image

  • CodeDeploy handles deploying that image to an EC2 instance


🔨 Step 1: Create a Build Project with CodeBuild

The first step is to set up a build project in AWS CodeBuild that can build and push our Docker image.

Here’s what I did:

  • Gave my project a name

  • Selected GitHub as the source and connected my repo

  • Chose a managed Ubuntu image for the build environment

  • Enabled privileged mode (this is a must for Docker builds)

Then I created a new IAM role for CodeBuild and attached these permissions:

  • AWSCodeBuildAdminAccess

  • AmazonSSMFullAccess (we’ll use this to store Docker credentials securely)


Step 2: Store Docker Credentials Securely

Instead of hardcoding Docker Hub credentials (which is bad practice from a security standpoint), I stored them in AWS Systems Manager Parameter Store.

I created three SecureString parameters:

  • /myapp/docker-credentials/username

  • /myapp/docker-credentials/password

  • /myapp/docker-credentials/url

These will be injected as environment variables into our build process.
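For reference, the same parameters can be created from the CLI. This is a sketch, not from the original post — the parameter names match the list above, and the values are placeholders (using a Docker Hub access token instead of the account password is a good habit):

```bash
# Store Docker Hub credentials as SecureString parameters in Parameter Store
aws ssm put-parameter --name /myapp/docker-credentials/username \
  --type SecureString --value "<dockerhub-username>"
aws ssm put-parameter --name /myapp/docker-credentials/password \
  --type SecureString --value "<dockerhub-access-token>"
aws ssm put-parameter --name /myapp/docker-credentials/url \
  --type SecureString --value "docker.io"
```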


Step 3: Writing the buildspec.yml

This YAML file tells CodeBuild what to do. Here’s what mine looked like:

version: 0.2

env:
  parameter-store:
    DOCKER_REGISTRY_USERNAME: /myapp/docker-credentials/username
    DOCKER_REGISTRY_PASSWORD: /myapp/docker-credentials/password
    DOCKER_REGISTRY_URL: /myapp/docker-credentials/url
phases:
  install:
    runtime-versions:
      python: 3.11
  pre_build:
    commands:
      - echo "Installing dependencies..."
      - pip install -r requirements.txt
  build:
    commands:
      - echo "Running tests..."
      - echo "Building Docker image..."
      - echo "$DOCKER_REGISTRY_PASSWORD" | docker login -u "$DOCKER_REGISTRY_USERNAME" --password-stdin "$DOCKER_REGISTRY_URL"
      - docker build -t "$DOCKER_REGISTRY_URL/$DOCKER_REGISTRY_USERNAME/simple-python-flask-app:latest" .
      - docker push "$DOCKER_REGISTRY_URL/$DOCKER_REGISTRY_USERNAME/simple-python-flask-app:latest"
  post_build:
    commands:
      - echo "Build completed successfully!"
artifacts:
  files:
    - '**/*'
  base-directory: ./

Step 4: Automate with CodePipeline

Now that we can build and push a Docker image, we need automation. That’s where CodePipeline comes in.

Steps I followed:

  1. Created a new pipeline with a new service role

  2. Set GitHub (v2) as the source

  3. Connected my repo and selected the branch

  4. Selected the CodeBuild project I just made as the build provider

  5. Skipped the deployment stage (for now)

That’s it. Now every time I push code to GitHub, CodePipeline triggers a build.


Step 5: Set Up CodeDeploy and EC2

Next, I wanted to deploy the Docker container to a real server.

Launching the EC2 instance:

  • Used the default VPC

  • Enabled auto-assign public IP

  • Added a tag: Name=codedeploy-instance

Installing CodeDeploy Agent:

I SSHed into the EC2 instance and followed the official AWS documentation for installing the CodeDeploy agent.
Be sure to:

  • Replace the bucket name and region

  • Use sudo systemctl start codedeploy-agent and check the status
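On Ubuntu, the agent install roughly looks like the following sketch. The region shown is only an example — the S3 bucket name embeds the region, so substitute your own:

```bash
sudo apt update
sudo apt install -y ruby-full wget
cd /tmp
# Bucket name pattern: aws-codedeploy-<region> (example region: us-east-1)
wget https://aws-codedeploy-us-east-1.s3.us-east-1.amazonaws.com/latest/install
chmod +x ./install
sudo ./install auto
sudo systemctl start codedeploy-agent
sudo systemctl status codedeploy-agent
```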

Installing Docker:

sudo apt update
sudo apt install -y docker.io
sudo systemctl enable --now docker

I also created a new IAM role (instance profile) with the EC2 and S3 permissions the agent needs, and attached it to the instance so CodeDeploy can talk to it.


Step 6: Create CodeDeploy Application + Deployment Group

Inside CodeDeploy:

  1. Created an application (type: EC2/on-premises)

  2. Created a deployment group:

    • Created a new IAM role and attached EC2FullAccess, EC2RoleForAWSCodeDeploy, and S3FullAccess (I learned this the hard way after spending several hours testing 😕)

    • Chose "in-place" deployment

    • Selected the EC2 instance by tag (Name=codedeploy-instance)

    • Skipped load balancing

If everything’s good, you’ll see a green “1 matching instance” message.
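The same application and deployment group can also be created from the CLI. A sketch under stated assumptions — the application name, group name, and service role ARN below are placeholders, not values from the original setup:

```bash
# Create an EC2/on-premises CodeDeploy application
aws deploy create-application \
  --application-name simple-python-flask-app \
  --compute-platform Server

# Create a deployment group that targets instances by the Name tag
aws deploy create-deployment-group \
  --application-name simple-python-flask-app \
  --deployment-group-name flask-app-dg \
  --service-role-arn arn:aws:iam::<account-id>:role/<codedeploy-service-role> \
  --ec2-tag-filters Key=Name,Value=codedeploy-instance,Type=KEY_AND_VALUE
```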


Step 7: Connect CodeDeploy to CodePipeline

Now I went back to CodePipeline and added a new stage called Deploy.

  • Added a CodeDeploy action

  • Chose the application and deployment group

  • Set the input artifact to BuildArtifact (from CodeBuild)

Hit save. Now the full loop is ready!


Test the Pipeline

I made a small code change and pushed to GitHub.
In seconds:

  • CodePipeline detected the change

  • CodeBuild built and pushed a new Docker image

  • CodeDeploy pulled and ran the container on my EC2

Then I logged into Docker Hub to confirm — and yes, the image was there!


One Last Thing: Don't Forget appspec.yml

For CodeDeploy to work, your GitHub repo must contain an appspec.yml file in the root folder.
It defines how CodeDeploy installs and runs your app on the instance.
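A minimal appspec.yml for an EC2/on-premises deployment might look like this. The destination path and hook script names are illustrative (they are not from the original repo), but the structure — `version: 0.0`, `os`, `files`, and lifecycle `hooks` — is what CodeDeploy expects:

```yaml
version: 0.0
os: linux
files:
  - source: /
    destination: /home/ubuntu/app
hooks:
  ApplicationStop:
    - location: scripts/stop_container.sh   # e.g. docker stop/rm the old container
      timeout: 60
      runas: root
  AfterInstall:
    - location: scripts/start_container.sh  # e.g. docker pull + docker run the new image
      timeout: 120
      runas: root
```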


Wrap Up

This setup is now fully automated:

  • I push code → Docker image builds → It gets deployed to EC2

  • No manual steps. No forgotten commands. Just clean DevOps flow.

Setting this up took some trial and error (especially with IAM roles and Docker permissions), but now it saves me hours of deployment work.
