Creating a Jenkins Pipeline with Docker as an Agent: A Step-by-Step Guide


What is Jenkins?
Jenkins is a popular pipeline orchestration platform used to automate every stage between checking out code and deploying it to a target platform. These stages may include static code analysis, unit testing, SAST/DAST scanning, and so on.
Earlier, a Jenkins setup would consist of a Jenkins Master node, used mostly for scheduling tasks onto Jenkins Worker nodes. These worker nodes, attached to a particular Master node, were often virtual machines and would perform the tasks defined in the pipeline scripts.
The worker nodes were categorized by role: VM-1 for database-related configurations, VM-2 for the back-end, etc. However, this architecture has some drawbacks. Firstly, all these VMs run continuously, even when not in use, leading to inefficient use of resources and higher costs. Additionally, making a change, such as updating the version or configuration of Node.js, requires logging into each VM individually, which is not operationally efficient.
This problem can be solved by using Docker as an agent on the Jenkins Master node, and in this blog, I'll demonstrate practically why this approach is better.
Prerequisites
Start by creating an EC2 instance on AWS of type Ubuntu (t2.micro), with an auto-assigned public IP and a key pair so you can SSH in.
SSH into the EC2 instance -
ssh -i <path-to-the-key-pair> ubuntu@<public-ip-of-the-instance>
Follow these steps to install Java (v17 or v21), which is a prerequisite for Jenkins, and then install Jenkins itself.
sudo apt update
sudo apt install openjdk-17-jdk
java -version
to verify the installation.
Next, add the Jenkins repository and install Jenkins (these commands are for Debian/Ubuntu instances only):
curl -fsSL https://pkg.jenkins.io/debian/jenkins.io-2023.key | sudo tee /usr/share/keyrings/jenkins-keyring.asc > /dev/null
echo deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc] https://pkg.jenkins.io/debian binary/ | sudo tee /etc/apt/sources.list.d/jenkins.list > /dev/null
sudo apt-get update
sudo apt-get install jenkins
sudo systemctl status jenkins
to check that Jenkins is active and running.
Jenkins is accessible on port 8080, so we have to edit the inbound rules of the security group attached to the EC2 instance to allow custom TCP traffic on port 8080. Then log in to Jenkins using the URL:
http://<public-ip-of-the-instance>:8080
sudo cat /var/lib/jenkins/secrets/initialAdminPassword
to get the initial admin password for logging in. Then install the suggested plugins.
After installation, create the First Admin User and remember the credentials to be able to log in again.
Now the Jenkins setup is complete and ready to use!
Since we are using Docker as the agent on the Master node, which is the EC2 instance, install Docker on it:
sudo apt update
sudo apt install docker.io
Grant the jenkins and ubuntu users permission to run Docker commands:
sudo su -
usermod -aG docker jenkins
usermod -aG docker ubuntu
systemctl restart docker
Install the Docker Pipeline plugin on Jenkins as well:
Go to Manage Jenkins > Manage Plugins.
In the Available tab, search for "Docker Pipeline".
Select the plugin and click the Install button.
After these installations, restart Jenkins for the changes to take effect, then log back in with the credentials you created:
http://<public-ip-of-the-instance>:8080/restart
Understanding the Jenkinsfile
It is crucial to understand the Jenkinsfile before starting to build the pipeline. The Jenkinsfile used here is very simple, but it can be extended to host real applications too. It is the syntax that matters.
As we can see, Groovy scripting is used to write the Jenkins pipeline. An agent must be specified at the beginning of the pipeline when there is only one stage to be executed. Here, just for demo purposes, the script consists of only one stage ('Test'). First, a container is created from the node:16-alpine Docker image and the task runs in that environment only. To confirm that the Node image is being used, its version is checked in the first stage.
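Here is a minimal sketch of what such a Jenkinsfile looks like. It follows the structure described above, though the exact contents of the repository's Jenkinsfile may differ slightly:

pipeline {
    // the whole pipeline runs inside a container created from this image
    agent {
        docker { image 'node:16-alpine' }
    }
    stages {
        stage('Test') {
            steps {
                // confirm the build really runs on the Node 16 image
                sh 'node --version'
                // a stage can hold several steps; this extra check is illustrative
                sh 'npm --version'
            }
        }
    }
}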
As the sketch shows, a stage can consist of several steps too. I will be writing a continuation blog explaining multiple stages using multiple Docker agents; the snippet below gives a rough taste of that pattern.
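A multi-agent pipeline looks roughly like this. The stage names and images here are purely illustrative, echoing the VM-1/VM-2 roles mentioned earlier, not taken from the demo repository:

pipeline {
    agent none   // no global agent; each stage declares its own container
    stages {
        stage('Back-end') {
            agent { docker { image 'node:16-alpine' } }
            steps {
                sh 'node --version'
            }
        }
        stage('Database') {
            agent { docker { image 'mysql:8' } }
            steps {
                sh 'mysql --version'
            }
        }
    }
}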
Building the Pipeline
Now that everything required is configured, let's build our pipeline!
Click 'New Item' on the Dashboard, select Pipeline for orchestration, and name the pipeline.
Configure the SCM for the pipeline, which is https://github.com/RachanaVenkat/Jenkins-docker-as-agent
The 'Script Path' is just Jenkinsfile here, as it is present at the root of the repository. If not, the full path must be specified.
Click 'Build Now' on the Jenkins job and the pipeline execution will begin. The whole process can be viewed in detail in the Console Output.
Jenkins follows the Jenkinsfile and first fetches the code from the GitHub repository.
Then it checks whether the required Docker image already exists locally. Since it doesn't, Jenkins pulls the image mentioned in the Jenkinsfile, starts a container, and checks the Node version inside it.
The most crucial thing to notice in the console output is that, after checking the version, Jenkins stops the container and removes all the volumes attached to it. When a new build is triggered, a fresh container is spun up, and it is removed as soon as the build completes.
It is also worth noting that this whole run took only 39 seconds to finish, which is very efficient.
So this is how the ephemeral nature of containers is advantageous, making the whole process cost- and resource-efficient, unlike when VMs are configured as permanent Worker nodes. Additionally, if you now need to change the Node.js version, all you have to do is update the image used as the agent in the Jenkinsfile.
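For instance, moving this demo to a newer Node.js release would be a one-line change (the node:20-alpine tag here is just an illustration, not something from the original repo):

agent {
    docker { image 'node:20-alpine' }   // previously 'node:16-alpine'
}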
Thank you all, have a good day :)