Streamlining Your Software Delivery Pipeline with AWS CodeBuild and CodePipeline

Amit Parad
10 min read


Please refer to the architecture diagram above. The only difference here is that we will be using a GitHub repo instead of AWS CodeCommit.

In this tutorial only the CI part will be covered.

First, if you need more information, go to the repository https://github.com/AmitP9999/aws-devops-zero-to-hero/tree/main/day-14 and check the AWS DevOps Zero to Hero repo, day 14. The README covers each and every step that we are going to implement. The repo also has simple-python-app, a Flask-based application as simple as a hello-world application, along with a requirements.txt file, as Python applications usually have.

We will also see how to create a Docker image and push it to Docker Hub. That is why, if you check the repository, there is a Dockerfile for the Python application. It is a very, very basic file.
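As a rough idea of what such a Dockerfile can look like, here is a minimal sketch, assuming the app lives in app.py and Flask listens on port 5000 (the actual file in the repo may differ):

```dockerfile
# Minimal sketch of a Dockerfile for a simple Flask app
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so this layer can be cached
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code
COPY . .

EXPOSE 5000

CMD ["python", "app.py"]
```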

There is also a buildspec.yaml; it has been added just for reference. We will write the file step by step and do it practically again, because we are going to implement the CI and build the application ourselves.

Do check the repository to understand it better.

So, as per the diagram, we will start with AWS CodeBuild; instead of AWS CodeCommit, we are using the GitHub repository.

Once we start with CodeBuild, we will be able to build the project with all the steps: code checkout, unit testing, image building, and then image pushing.

So we can initiate the first step by going to the AWS console and opening AWS CodeBuild.

Then Click on Create Project.

We don't have to enable the below options.

What if we enable concurrent build limits?

If developers push multiple commits to the GitHub repo or AWS CodeCommit, do we want to restrict the number of concurrent builds? In other words, if 100 developers trigger code changes, do we want to run 100 builds at once, or do we want to cap how many run at the same time? As of now, we don't want to restrict anything here.

Then we will move to the Source 1 - Primary section.

We have to mention the source provider here. In our case, we are using GitHub.

The supported providers are shown above, so we can select GitHub.

As per the image above, we are not yet connected to GitHub. We should connect to our GitHub repo first. Click on Manage default source credential.

Click on Connect to GitHub.

Click Confirm.

Now the option has been changed.

The GitHub connection has now been established, and there is no need to modify the other options for now.

Now coming to the Environment section. The environment is a virtual machine image or a Docker image, and AWS offers a preset of these managed images.

We have taken Ubuntu for better understanding. This is the environment that AWS provides for us. With Jenkins, we would create worker nodes, install an OS on EC2 instances on top of them, and ask Jenkins to run the pipeline on that environment; for example, if we have two worker nodes, we tell Jenkins to use one of them to execute the pipeline.

In the same way, AWS CodeBuild provides ready-made images, so we can run our builds on those images.

In the Image section always take the Latest Image.

Coming to the Service role section: to perform actions on AWS, CodeBuild needs a service role.

Why do we use a service role here? Because CodeBuild is a service, and it is the one that will perform the actions on our behalf.

We don't need to modify anything in Additional configuration.

Coming to the Buildspec, which is very important.

The buildspec file plays the same role as a GitHub Actions workflow or a Jenkinsfile. As per the diagram above, for each stage we have to execute, we have to write some instructions; for example, checking out the code, creating the image, and so on. All of that goes into the buildspec file.
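At a high level, a buildspec is just YAML with a version and a set of phases. A bare skeleton, only a sketch that we will fill in step by step below, looks roughly like this:

```yaml
version: 0.2

phases:
  install:      # install runtimes or tools the build needs
    commands:
      - echo "Install phase"
  pre_build:    # prerequisites, e.g. installing dependencies
    commands:
      - echo "Pre-build phase"
  build:        # the actual build steps
    commands:
      - echo "Build phase"
  post_build:   # runs after the build (not used in this CI-only demo)
    commands:
      - echo "Post-build phase"
```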

We can use the first option, Insert build commands, and click on Switch to editor, which lets us write the file right there.

So we will write this file through the editor.

So we can start with the phases section instead of adding any environment variables yet.

Then, before moving to build, we can add the pre_build phase, which covers the prerequisites.

Even before we build the application, we have to install Flask in the build environment.

We can check requirements.txt in the GitHub repo.

With that, we are done with the pre_build entry in the file.

In that pre_build command, we have to give the path to that particular requirements.txt file. It will basically install Flask on the image.
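A sketch of what that pre_build entry can look like; the path below is an assumption based on the repo layout, so adjust it to wherever requirements.txt actually lives:

```yaml
phases:
  pre_build:
    commands:
      # Install the application's Python dependencies (Flask, per requirements.txt)
      - pip install -r day-14/simple-python-app/requirements.txt
```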

Next, we move on to the build stage.

As per the architecture diagram, the first step is checking out the code. In this setup, the checkout happens automatically because we have already integrated the GitHub repo, so the code checkout stage is taken care of for us.

Then we move to the second stage, i.e. the build commands.

As we saw, we must not keep any sensitive information, i.e. the username and password, next to the docker build -t <image tag> command in that file. We cannot put the Docker username and password here; instead we will store our credentials in AWS Systems Manager, just as in Jenkins we would use the Jenkins credentials store. Or we can use an external tool such as HashiCorp Vault.

If we put the password, username, or any other sensitive information in this file, anyone who opens it can see that information, which would compromise security.

The docker build and docker push commands above are kept incomplete for now, because we are first going to see how AWS Systems Manager stores the credential information.

We are only implementing the CI part here; we are not implementing the CD part, so we don't have to add any commands in the post_build section of the buildspec file.

Once done, click on Create build project.

Now we will see a demo of how to store the sensitive information.

Go to AWS Systems Manager.

Go to Parameter Store in the left-side pane.

Click on Parameter Store, then click on Create parameter.

For Type, always select "SecureString".

We can put the value as well.

Then click on Create parameter.
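The same thing can be done from the CLI. Here is a sketch; the parameter names (/myapp/docker-credentials/username and /myapp/docker-credentials/password) are the ones assumed for this walkthrough, so use whatever naming you actually created:

```bash
# Create SecureString parameters for the Docker Hub credentials
aws ssm put-parameter --name /myapp/docker-credentials/username \
  --type SecureString --value "<docker-hub-username>"
aws ssm put-parameter --name /myapp/docker-credentials/password \
  --type SecureString --value "<docker-hub-password>"
```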

So we have created the below parameters:

Then go back to the buildspec.yaml and edit the file: in the project, scroll down to the Buildspec section and click Edit.

So we will uncomment the environment variables first.

We have added the variables below in the file. All of the values are taken from AWS Systems Manager, where we created a parameter for each of them.

Keep in mind there is a minor change in the code: instead of hard-coded values, we reference the parameters as shown below.
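Roughly, the env section looked like this at this point. The variable names are just the ones assumed for this walkthrough; note the my-app prefix, which will come back to bite us shortly:

```yaml
env:
  parameter-store:
    # Pull the Docker Hub credentials from Systems Manager Parameter Store
    DOCKER_REGISTRY_USERNAME: /my-app/docker-credentials/username
    DOCKER_REGISTRY_PASSWORD: /my-app/docker-credentials/password
```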

Then we will go to the docker commands 👍
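The build phase then ends up looking something like the sketch below, reusing the environment variables defined above; the image name simple-python-app comes from the repo, and the tag is arbitrary:

```yaml
phases:
  build:
    commands:
      # Build the Docker image (adjust the build context if the Dockerfile
      # is not at the repo root)
      - docker build -t "$DOCKER_REGISTRY_USERNAME/simple-python-app:latest" .
      # Push the image to Docker Hub (requires a docker login, added later)
      - docker push "$DOCKER_REGISTRY_USERNAME/simple-python-app:latest"
```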

The docker build and push commands are now in place. So let's try to run this CI with AWS CodeBuild: update the buildspec and click on Update buildspec.

Then click on start build.

We have got the following error message.

The error message indicates that the CodeBuild project is unable to access the parameter my-app/docker-credentials/password in AWS Systems Manager (SSM) due to insufficient permissions.

To resolve this issue, we need to grant the CodeBuild project the ssm:GetParameters permission on the specified parameter.
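For reference, a least-privilege version of that could be applied from the CLI roughly as below; the role name, region, account ID, and parameter path are placeholders, and in the next step I simply attach the AmazonSSMFullAccess managed policy from the console instead:

```bash
# Attach an inline policy that lets the CodeBuild role read the SSM parameters
aws iam put-role-policy \
  --role-name <codebuild-service-role-name> \
  --policy-name codebuild-ssm-read \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": ["ssm:GetParameters"],
      "Resource": "arn:aws:ssm:<region>:<account-id>:parameter/<parameter-path>/*"
    }]
  }'
```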

So we will go into the IAM role.

Click on the role that was created with the build project.

Click on Add permissions, then Attach policies, and select AmazonSSMFullAccess.

Then click on Add Permissions.

So we will click Start build again and see what the result is.

Now we have got the below error message.

We found the cause of the error above: in the buildspec.yaml file we put my-app in the environment variable path, whereas we should have put only myapp. So we can change that and rebuild.

So we have changed the environment variables as below:
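After the fix, the env section, again a sketch with the assumed variable names, points at the myapp path:

```yaml
env:
  parameter-store:
    DOCKER_REGISTRY_USERNAME: /myapp/docker-credentials/username
    DOCKER_REGISTRY_PASSWORD: /myapp/docker-credentials/password
```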

Along with the above changes, we can also put the password into the Value field on the AWS console, so we clicked on Edit to change it.

Then click on Save changes.

We retry the build again. Now we get a new error message.

So we found the issue.

We have to correct it in the buildspec.yaml file. Again go to the build project, edit the file, and rebuild. We corrected the spelling and updated the buildspec. After the rebuild we got the result below:

We also have to add a docker login command in the buildspec for authentication.
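In this walkthrough that was one extra line in pre_build, sketched below; piping the password through --password-stdin keeps it out of the command line and the build logs:

```yaml
phases:
  pre_build:
    commands:
      # Authenticate to Docker Hub with the credentials pulled from Parameter Store
      - echo "$DOCKER_REGISTRY_PASSWORD" | docker login --username "$DOCKER_REGISTRY_USERNAME" --password-stdin
```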

This is how we implemented the build with AWS CodeBuild and pushed the Docker image to Docker Hub. Now we can integrate it with AWS CodePipeline.

Go to AWS CodePipeline.

Click on Create pipeline. With CodePipeline, we also get to know who made the changes in the GitHub repo.

Give the name to the pipeline.

Click Next.

Then select GitHub as the source provider. GitHub (Version 1) is no longer recommended, so I selected GitHub (Version 2).

Click on Connect to GitHub.

Give the connection a name.

Then click on Connect to GitHub.

It is very important that the GitHub repo or AWS CodeCommit is connected, because whenever there is a code change, it has to send a request to the pipeline, and the pipeline has to be invoked.

Give the repo name and select main as the branch.

Select the Output format as below:

Then Click on Next.

Unfortunately, the CodeBuild project I had created was accidentally deleted, so I'm unable to add the project here.

If the project still exists, it shows up in the drop-down list.

If we make a change to the code in the GitHub repo, the pipeline picks it up, so we know the code has changed. That is the advantage of AWS CodePipeline: it automatically invokes AWS CodeBuild whenever there is a code change in the GitHub repo or AWS CodeCommit.

Benefits of Using AWS CodePipeline:

Faster Time-to-Market: Automate your software delivery process and accelerate deployment cycles.

Improved Quality: Ensure consistent and reliable deployments with automated testing and validation.

Increased Efficiency: Reduce manual effort and errors by automating repetitive tasks.

Enhanced Collaboration: Improve collaboration among development teams by providing a centralized view of the software delivery process.

Scalability: Easily scale your pipelines to handle growing workloads and increasing complexity.

Typical Use Cases for AWS CodePipeline:

Continuous Integration and Continuous Delivery (CI/CD): Automate the build, test, and deployment of applications.

Microservices Deployments: Manage and deploy microservices architectures efficiently.

Infrastructure as Code (IaC): Automate the provisioning and management of infrastructure resources.

DevOps Practices: Implement DevOps principles and streamline the software development lifecycle.

How CodePipeline Works:

Source Stage: Trigger the pipeline based on changes in a source code repository (e.g., CodeCommit, GitHub, BitBucket).

Build Stage: Build the application using tools like CodeBuild or custom scripts.

Test Stage: Test the application using automated tests (e.g., unit tests, integration tests).

Deploy Stage: Deploy the application to various environments (e.g., development, staging, production).

By leveraging the power of AWS CodePipeline, you can streamline your software development process, improve quality, and accelerate time-to-market.

Thank You !!

Happy Learning !!


Written by

Amit Parad

Experienced Cloud / DevOps Engineer with a passion for automating infrastructure and streamlining software delivery processes. Skilled in AWS, Docker, Kubernetes, CI/CD pipelines, Ansible, Terraform & Jenkins. Proven ability to collaborate with development, operations, and QA teams to ensure efficient and reliable deployments.