How To Build a CI/CD Pipeline with AWS CodeBuild, CodeCommit, and CodeDeploy

Jamillah Bello

Introduction

Continuous Integration and Continuous Deployment (CI/CD) pipelines are essential for modern software development, enabling automated code testing, building, and deployment. AWS offers a suite of services (AWS CodeBuild, AWS CodeCommit, and AWS CodeDeploy) that streamline the CI/CD process, reducing manual effort and improving deployment efficiency.

In this guide, we will walk through setting up a fully automated CI/CD pipeline to deploy an application to AWS Elastic Beanstalk. The pipeline will be configured to automatically build and deploy application updates whenever changes are pushed to a Bitbucket repository.

Initially, our codebase is hosted on GitHub, but we will migrate it to Bitbucket to align with our pipeline requirements. Bitbucket provides a seamless integration with AWS, allowing us to trigger builds and deployments efficiently. We will use AWS CodeCommit as an intermediate step to facilitate integration between Bitbucket and AWS services.

Additionally, we will create and initialize our database, ensuring the application has a properly configured data layer before deployment. This step will involve:

  • Setting up a managed database service (Amazon RDS)

  • Running database migrations and seeding initial data

What You’ll Learn in This Guide:

  • Migrating Code from GitHub to Bitbucket – Moving an existing repository while preserving version history.

  • Setting Up AWS CodeCommit – Acting as a bridge between Bitbucket and AWS services.

  • Configuring AWS CodeBuild – Automating the build process to compile, test, and package the application.

  • Setting Up AWS CodeDeploy – Automating deployments to AWS Elastic Beanstalk.

  • Creating a Database and Initializing It – Provisioning a database and running migrations/seeding.

  • Creating an AWS CodePipeline – Automating the entire CI/CD workflow.

  • Testing the CI/CD Pipeline – Ensuring seamless integration and automatic deployments.

By the end of this guide, you will have a fully functional CI/CD pipeline that automates application deployment to AWS Elastic Beanstalk, along with a properly configured database, ensuring a complete and efficient deployment setup.


CI/CD Project

S/N  Project Tasks
 1   Create a key pair
 2   Set up Elastic Beanstalk
 3   Create an RDS database
 4   Initialise the database
 5   Set up Bitbucket
 6   Set up Code Build
 7   Set up Code Pipeline
 8   Test the setup

Checklist

  • Task 1: Creating a key pair

  • Task 2: Create an EC2 Instance Profile

  • Task 3: Set Up Elastic Beanstalk

  • Task 4: Create RDS Database

  • Task 5: Edit the RDS security group to allow inbound traffic from the EC2 instances created by Elastic Beanstalk

  • Task 6: Initialise Database

  • Task 7: Set up the Code base

  • Task 8: Set up an S3 bucket for artifact storage

  • Task 9: Set up Code Build

  • Task 10: Set up code pipeline

  • Task 11: Test your pipeline

Project repo: jprofile

Documentation

Create a Key Pair

  • Enter 'key pairs' in the search field and select Key pairs① from the displayed options.

  • Click the Create key pair② button to proceed.

  • Provide a Name③ for the key pair and click Create key pair④.


Create an EC2 Instance Profile

  • Enter IAM① in the search bar and choose IAM② from the displayed services.

  • Navigate to Roles③ by clicking on it.

  • Click the Create role④ button.

  • Click the chevron icon⑤, then select EC2⑥ from the dropdown menu.

  • Click on Next⑦ to proceed.

  • In the search field, search for bean⑧, and select the four policies shown in the image:

    • AdministratorAccess-AWSElasticBeanstalk: This policy grants full administrative access to AWS Elastic Beanstalk and its associated resources. It allows a user or role to create, update, and delete Elastic Beanstalk applications, environments, and configurations. It also provides permissions to manage EC2 instances, load balancers, auto-scaling groups, and other resources Elastic Beanstalk provisions.

    • AWSElasticBeanstalkCustomPlatformforEC2Role: This policy is designed for EC2 instances that run custom Elastic Beanstalk platforms. It provides the necessary permissions for instances to download platform components, manage logs, and interact with other AWS services needed to run a custom platform.

    • AWSElasticBeanstalkRoleSNS: This policy allows Elastic Beanstalk to send notifications via Amazon Simple Notification Service (SNS). It enables Elastic Beanstalk to publish messages related to application and environment events, such as deployment statuses or failures, which can then trigger alerts or automated actions.

    • AWSElasticBeanstalkWebTier: This policy is intended for instances in the web tier of an Elastic Beanstalk environment. It grants permissions for web servers to interact with Elastic Beanstalk, manage logs, and perform basic operations required for hosting web applications. It ensures that web-tier instances can function properly within the Elastic Beanstalk-managed infrastructure.

  • Click Next⑨ to proceed.

  • Provide a Role name⑩ and Description⑪.

  • Click Create role⑫ to finalize the process.


Set up your Elastic Beanstalk Environment

Configure environment

  • Enter 'beanstalk' in the search field, then select Elastic Beanstalk① from the search results.

  • Click Create application② to begin the process.

  • Enter your Application name③, and in the Environment information section, provide your Environment name④ and Domain⑤, then check availability⑥.

Note:

It is essential that the domain name is unique, since it will be used to construct the URL.

  • Since our app runs on Tomcat, click the chevron icon⑦ and select Tomcat⑧ as the platform.

  • Choose custom configuration⑨ in the Presets section, then click Next⑩.

Configure service access

  • For service roles, select Create and use new service role①; then, under EC2 key pair, click the chevron down icon② and choose the key pair you created earlier.

  • Click within the empty field to select your created EC2 instance profile③, and proceed by clicking Next④.

Set up networking, database, and tags

  • Click the chevron icon① to open the dropdown menu, then select the default vpc②.

  • Ensure the public IP address is Activated③ and select all Availability Zones④.

  • Click the Add new tag⑤ button.

  • Specify a Key⑥ and Value⑦ pair, and proceed by clicking Next⑧.

Configure instance traffic and scaling

  • Click on the Root volume type field, then choose General Purpose 3 (SSD)① from the options.

  • Within the Auto Scaling group section, choose the Load balanced② option.

  • Specify the desired minimum and maximum number of Instances③ to be provisioned.

  • Scroll down to Instance types and change it to t2.micro④ to remain within the free tier limits.

  • Within the Processes section, click the radio button⑤ to select it, then click Actions⑥.

  • Choose Edit⑦.

  • Click the Chevron down icon⑧ next to Sessions, verify that Session stickiness is Enabled⑨, and then click Save⑩.

  • Click the Next⑪ button.

Configure updates, monitoring, and logging

  • Within the Application deployments section, click on the Deployment policy field, then choose Rolling① from the options.

  • Specify the Deployment batch size② you want to use.

Note:

For this example, we are using a Deployment batch size of 50%. However, in a production environment with multiple instances, it's recommended to select no more than 25%, and ideally around 10%, so that you deploy to roughly one instance at a time.

  • Click the Next③ button.

  • Take a moment to review your settings, and then click Submit④.

  • Your Elastic Beanstalk environment is currently being created. This process may take some time, so while you wait, proceed to the next step.


Create an RDS Database

  • Enter RDS① in the search bar and choose Aurora and RDS② from the displayed services.

  • Click Create database③ to begin setting up a new database.

  • Select MySQL④.

  • Choose the Free tier⑤ option to keep your database usage within the free tier.

  • Provide a unique DB instance identifier⑥, then select the Auto generate password⑦ option.

  • In the VPC section, select Create new⑧ and type in your New VPC security group name⑨.

  • Click on the chevron icon⑩ next to Additional configuration.

  • Enter your desired Initial database name⑪.

Note:

It's important for this exercise that you name your initial database 'accounts' to match the configuration in our codebase.

  • Click on Create database⑫.

  • Close⑬ the pop up message.

  • Click View credential details⑭ to get the credentials for your database.

  • Copy the displayed details and save them in a secure note.


RDS Security Group Setup

  • Search for EC2 in the search bar and then select EC2① from the services.

  • Click on Instances②.

  • Make a note of the two servers provisioned by Elastic Beanstalk.

  • Choose one of the servers③ created and then click the link to its Security group④.

  • Copy the Security group ID⑤ of the EC2 instances and then go to Security Groups⑥ under the Network & Security tab.

  • Click the Security group ID⑦ of the security group created by RDS.

  • Click Edit inbound rules⑧.

  • Click the Add rule⑨ button.

  • Search and select MYSQL/Aurora⑩.

  • Paste in the security group ID⑪ of the instances you copied earlier, and click Save rules⑫.

  • Return to the Instances⑬ page.


Initialise Database

  • Select any of the instances① and copy its Public IPv4 address②.

  • Execute the following command in your terminal: ssh -i <"key pair name"> ec2-user@ec2-<Public IP>.compute-1.amazonaws.com.

Note:

Make sure to replace <"key pair name"> with the exact name of your key pair file and <Public IP> with your Public IP address. When writing the Public IP into the hostname, substitute the dots (.) with dashes (-). For example: ssh -i "cicdbeankey.pem" ec2-user@ec2-3-90-183-88.compute-1.amazonaws.com.
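The dot-to-dash substitution can also be scripted. A small bash sketch (the IP value is just an example, and the hostname pattern shown assumes us-east-1):

```shell
# Build the EC2 public DNS name from a public IP.
# Other regions use the form ec2-<ip>.<region>.compute.amazonaws.com.
PUBLIC_IP="3.90.183.88"              # example value -- use your instance's IP
HOST="ec2-${PUBLIC_IP//./-}.compute-1.amazonaws.com"
echo "$HOST"
# Then connect with: ssh -i "your-key.pem" ec2-user@"$HOST"
```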

  • Run the command dnf search mysql① to search for MySQL packages, and then copy the name of the MySQL client package② from the results.

  • Run the following command to install the MySQL-compatible client: sudo dnf install mariadb105.

  • Execute this command in your terminal: mysql -h <your rds endpoint> -u <your user> -p accounts, substituting your RDS endpoint for <your rds endpoint> and your username for <your user>. When prompted, enter your password to log in to the 'accounts' database. Once you've verified that you can access it, type exit to disconnect.

  • Copy the URL④ from your browser's address bar.

  • Return to your terminal window and execute this command to download the 'db_backup.sql' file. Note the raw file URL: the regular GitHub page URL would download an HTML page rather than the SQL file: wget https://raw.githubusercontent.com/StrangeJay/jprofile-project/aws-ci/src/main/resources/db_backup.sql.

  • Execute the first command below in your terminal to import the 'db_backup.sql' file into your 'accounts' database. Make sure to replace <your rds endpoint> and <your user> with your RDS endpoint and username: mysql -h <your rds endpoint> -u <your user> -p accounts < db_backup.sql. Then, run the second command to log in: mysql -h <your rds endpoint> -u <your user> -p accounts.

  • Run the SQL command show tables; in your MySQL prompt. This will list the tables in your current database. Compare this list with the tables shown in the image to confirm they match.

Set Up Your Code Repository

  • Go to Bitbucket and create a Bitbucket account if you don't have one yet.

  • Click on Create a workspace①.

  • Provide a Name② for your workspace in the designated field, and then click the Agree and create workspace③ button.

  • Select Create repository④.

  • Fill in the necessary details and click on Create repository⑤.

Note:

When creating the repository, ensure it's empty. Do not add a README file or a .gitignore file, as we will be migrating our existing GitHub repository here.

  • To check for existing SSH keys on your system, go to your terminal and run the following command: ls ~/.ssh/.

Note:

You can either use the existing ssh key or create a new one.
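If you need to create a new key, one way to generate an ed25519 key pair (the file name and email comment here are just examples):

```shell
# Generate a new ed25519 key pair for Bitbucket; -N "" sets an empty passphrase.
ssh-keygen -t ed25519 -N "" -f ~/.ssh/bitbucket_key -C "you@example.com"
cat ~/.ssh/bitbucket_key.pub   # this public key is what you will paste into Bitbucket
```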

  • Run the command cat <public key name> to display your public key⑥, and then copy the displayed key.

Note:

Make sure to replace <public key name> in the command with the actual filename of your public key.

  • Click the gear icon⑦ and then select Personal Bitbucket settings⑧ from the menu.

  • Select SSH keys⑨ from the left-hand menu, and then click the Add key⑩ button.

  • In the form provided, fill in the necessary details⑪. Give your SSH key a descriptive name, paste the public key you copied earlier into the designated field, and then click Add key⑫.

  • Return to your terminal and create a configuration file. This setup will allow Git to use your private SSH key for authentication when you perform git push or git pull operations with your Bitbucket repository.

  • In your terminal, run the following commands to create and edit the SSH config file: cd ~/.ssh && nano config. Once the nano text editor opens, paste the following configuration, making sure to replace ~/.ssh/<private key name> with the actual path to your private key file:

# bitbucket.org
Host bitbucket.org
  PreferredAuthentications publickey
  IdentityFile ~/.ssh/<private key name>

  • Run the command ssh -T git@bitbucket.org in your terminal to test your SSH connection to Bitbucket. If the connection is successful, you should see a message similar to the one shown in the image.

  • Select the SSH option and then copy⑭ the displayed URL.

  • Run the following command to download the codebase to your local machine: git clone git@github.com:StrangeJay/jprofile-project.git⑮. Once the download is complete, navigate into the repository directory by running: cd jprofile-project⑯.

  • Run the command git checkout aws-ci⑰ to switch to the 'aws-ci' branch.

  • Run the command cat .git/config⑱ to view the remote repository that your local repository is currently tracking. Then, run git remote rm origin⑲ to remove the connection to the GitHub repository.

  • Go back to your repository on Bitbucket. Locate the SSH clone command⑳ and copy the portion of the URL that comes after git clone, as shown in the image below.

  • Go back to your terminal and run the following command, replacing <copied Bitbucket SSH url> with the SSH URL you just copied from Bitbucket: git remote add origin <copied Bitbucket SSH url>㉑.

  • Run the command cat .git/config again in your terminal. Verify that the [remote "origin"] section now shows the Bitbucket URL㉒ you just added.

  • Run the command git push origin --all㉓ to push all local branches (including 'main' and 'aws-ci') to your Bitbucket repository. After running the command, confirm in your Bitbucket repository that both branches㉔ ('main' and 'aws-ci') have been successfully pushed.

  • Go back to your Bitbucket repository to verify that the code and branches have been successfully pushed.

  • On Bitbucket, click the chevron down icon㉕ located near the 'main' branch name. A dropdown menu should appear, where you should confirm that you see both the main and aws-ci branches㉖ listed.
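Put together, the migration steps above boil down to the following sequence (the Bitbucket URL is shown with placeholders for your workspace and repository names):

```shell
# Clone from GitHub, switch branches, and repoint 'origin' at Bitbucket.
git clone git@github.com:StrangeJay/jprofile-project.git
cd jprofile-project
git checkout aws-ci
git remote rm origin                                            # drop the GitHub remote
git remote add origin git@bitbucket.org:<workspace>/<repo>.git  # your Bitbucket SSH URL
git push origin --all                                           # pushes main and aws-ci
```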


Create an S3 Bucket

  • Go to your AWS console, search for 'S3' in the search bar, and select S3① from the list of services.

  • Click on Create bucket②.

  • Provide a Bucket name③.

  • Click on Create bucket④ to create the bucket.

Note:

Just give your bucket a name and create it; leave every other setting at its default.


Set Up Code Build

  • In your AWS console, use the search bar to find 'CodeBuild', and then choose CodeBuild① from the displayed services.

  • Click on Create project②.

  • Enter a Project name③ in the designated field.

  • In the Source provider section, click the chevron icon④ and select Bitbucket⑤ from the dropdown menu that appears.

  • Click on Manage account credentials⑥ to connect your AWS CodeBuild project to your Bitbucket account.

  • Click the chevron icon⑦ and then select OAuth app⑧ as the Credential type from the dropdown menu.

Note:

While using a Bitbucket access token is the recommended and most secure method, it's a paid feature. Therefore, as long as you are logged into Bitbucket in the same web browser, using OAuth app is a viable alternative.

  • Choose CodeBuild⑨ and then click the Connect to Bitbucket⑩ button.

  • Click the Confirm⑪ button to finalize the connection.

  • You should see a confirmation message indicating that your Bitbucket account has been successfully connected.

  • Click in the empty field below 'Repository'. A dropdown menu will appear; select your Bitbucket repository⑫ from the list.

  • In the 'Source version' field, type the branch name aws-ci⑬.

  • Choose Ubuntu⑭ as the operating system for your build environment.

  • Scroll down the page until you find the 'Buildspec' section, and then click the Switch to editor⑮ button.

  • Either download the buildspec.yml file to your computer or open it directly in your browser. Once you have access to its contents, copy everything and paste it into the text editor on the AWS CodeBuild page. Next, find and replace the placeholders or fields that need your specific configuration.
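For orientation, a Maven buildspec for a Tomcat application generally follows the shape below. This is only a sketch: treat the repo's own buildspec.yml as the source of truth, and note that the artifact layout and the placeholder values are assumptions, not the project's actual settings. The sed lines are explained in the note that follows.

```yaml
version: 0.2
phases:
  pre_build:
    commands:
      # Point the app at your RDS instance (placeholder values shown)
      - sed -i 's/jdbc.password=admin123/jdbc.password=<your-db-password>/' src/main/resources/application.properties
      - sed -i 's|db01:3306|<your-rds-endpoint>:3306|' src/main/resources/application.properties
  build:
    commands:
      - mvn clean install -DskipTests
artifacts:
  files:
    - '**/*'
  base-directory: target
```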

Note:

This command consists of three sed (stream editor) operations that modify the application.properties file located at src/main/resources/application.properties. Each command uses the sed -i option to perform an in-place substitution, meaning the file is modified directly without creating a new one.

- sed -i 's/jdbc.password=admin123/jdbc.password=nr1mTWY6OvlLBovvmZpD/' src/main/resources/application.properties
- sed -i 's/jdbc.username=admin/jdbc.username=admin/' src/main/resources/application.properties
- sed -i 's/db01:3306/vprodb.c50sgqqusvnr.us-east-1.rds.amazonaws.com:3306/' src/main/resources/application.properties

These sed commands modify the application.properties file in src/main/resources/ by updating database credentials and connection details. The first command replaces the database password, changing jdbc.password=admin123 to jdbc.password=nr1mTWY6OvlLBovvmZpD, where nr1mTWY6OvlLBovvmZpD is a placeholder and should be replaced with your actual password. The second command attempts to update the database username, but since the replacement value is the same (admin), no actual change occurs. The third command updates the database host, replacing db01:3306 with vprodb.c50sgqqusvnr.us-east-1.rds.amazonaws.com:3306, which is an AWS RDS endpoint. All placeholders, including the password and database host, should be replaced with your actual connection details before running these commands. Study this image to see how it should look.
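To see the substitutions in action, here is a self-contained demo on a throwaway properties file (the file contents and replacement values below are placeholders, not the project's real credentials):

```shell
# Create a sample application.properties resembling the one in the repo
mkdir -p src/main/resources
cat > src/main/resources/application.properties <<'EOF'
jdbc.username=admin
jdbc.password=admin123
jdbc.url=jdbc:mysql://db01:3306/accounts
EOF

# Swap in placeholder values (substitute your real password and RDS endpoint)
sed -i 's/jdbc.password=admin123/jdbc.password=MY_DB_PASSWORD/' src/main/resources/application.properties
sed -i 's|db01:3306|my-rds-endpoint.rds.amazonaws.com:3306|' src/main/resources/application.properties

cat src/main/resources/application.properties
```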

  • Copy the complete content of the buildspec.yml file. Then, go back to your AWS console and paste this entire content into the 'Build commands' field.

  • Scroll down to the 'Artifacts' section. Under 'Type', select Amazon S3⑯ from the available options.

  • Click in the 'Bucket name' field. A dropdown menu will appear; select the bucket⑰ you created earlier from that list.

  • Enter a Group name⑱ and a Stream name prefix⑲ for your CloudWatch logs. Once you've done this, click the Create build project⑳ button.

  • Now that your project is created, click the Start build㉑ button to begin the build process.

  • Check the build status to confirm that it shows 'Succeeded'.


Set Up Code Pipeline

  • In your AWS console, use the search bar to find 'CodePipeline', and then choose CodePipeline① from the displayed services.

  • Click the Create pipeline② button.

  • Select the Build custom pipeline③ option and then click the Next④ button.

  • Enter a Pipeline name⑤ and then click the Next⑥ button.

  • Click on the 'Source provider' field. A dropdown menu will appear; select Bitbucket⑦ from that list.

  • Click the Connect to Bitbucket⑧ button to link your Bitbucket account to CodePipeline.

  • Enter a Connection name⑨ in the provided field and then click the Connect to Bitbucket⑩ button.

  • Click on Install a new app⑪.

  • Click the Grant access⑫ button.

  • Click on Connect⑬ to complete the connection.

  • Click in the 'Repository name' field. A dropdown menu will appear; select your Bitbucket repo⑭ from that list.

  • Click on the Default branch field and select aws-ci⑮ from the dropdown menu.

  • Click Next⑯.

  • Choose Other build providers⑰. Next, locate the empty field underneath. Click on it to open a dropdown list, and then select AWS CodeBuild⑱ from the options.

  • Click in the Project name field and select your created build project⑲.

  • Click the Next⑳ button to proceed to the next step.

  • Choose AWS CodeBuild㉑ as the test provider for your pipeline.

Note:

This step is optional and can be skipped if you prefer.

  • Click in the Project name field and select your build project㉒.

  • Click the Next㉓ button.

Note:

Select ‘Source artifact’ as your input artifact. This ensures that your tests run against the original source code, which includes all necessary files and folder structure (like test directories and dependency manifests). Using the ‘Build artifact’ may lead to errors if it lacks required test files or is in a packaged format unsuitable for testing.

  • In the deployment stage, choose AWS Elastic Beanstalk㉔ as the Deploy provider from the available options.

  • Click on the 'Application name' field and select your desired Application name㉕ from the options provided.

  • Select your Environment name㉖.

  • Click Next㉗.

  • Take a moment to review all the pipeline configurations you've set up. Once you're satisfied, click the Create pipeline㉘ button.

  • Verify that your pipeline has been successfully created. Once confirmed, wait for the pipeline execution to finish.

  • If the setup was performed correctly, all stages of your pipeline should show a successful result, indicated by green tick icons.


Test the Entire Set Up

  • Go to your Elastic Beanstalk environment page. There, click on the Domain name① link to open your deployed website.

  • If your setup was done correctly, a webpage should load. On this page, enter Admin_vp as the Username and Admin_vp as the Password to log in.

  • Congratulations! Your application has been successfully deployed.

  • To verify if the pipeline triggers automatically as expected, open your terminal and connect to the repository where your code is stored.

  • Make a small modification to one of your code files and then push this change.

  • You should now observe that your project has automatically started building again in response to your code push.
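A trivial change is enough to trigger the pipeline. For example (the file touched here is arbitrary, chosen just for illustration):

```shell
cd jprofile-project
git checkout aws-ci
echo "pipeline trigger test" >> pipeline-test.txt   # any small change works
git add pipeline-test.txt
git commit -m "test: trigger CI/CD pipeline"
git push origin aws-ci
```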


And with that, the project is complete. You have successfully created an AWS pipeline.
