Creating a Jenkins Pipeline with Docker as an Agent: A Step-by-Step Guide

Rachana
5 min read

What is Jenkins?

Jenkins is a popular pipeline orchestration platform, used to automate all the stages between checking out the code and deploying it to a desired platform. These stages may include static code analysis, unit testing, SAST/DAST scans, etc.

Traditionally, a Jenkins setup consisted of a Jenkins Master node, used mostly for scheduling tasks on the Jenkins Worker nodes. These worker nodes, configured against a particular Master node, were often Virtual Machines and would perform the tasks defined in the pipeline scripts.

The worker nodes were categorized with specific roles: VM-1 for database-related configurations, VM-2 for the back-end, etc. However, this architecture has some drawbacks. Firstly, all these VMs run continuously, even when not in use, leading to inefficient use of resources and higher costs. Additionally, making changes, such as updating the version or configuration of Node.js, requires logging into each VM individually, which is not operationally efficient.

This problem can be solved by using Docker as an agent on the Jenkins Master node, and in this blog, I'll demonstrate practically how this approach is better.

Pre-requisites

  1. Start by creating an EC2 instance on AWS (Ubuntu, t2.micro) with an auto-assigned public IP and a key pair so you can SSH into it.
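    If you prefer the AWS CLI over the console, the instance can be created with something like the following sketch (the AMI ID and key-pair name are placeholders; substitute an Ubuntu AMI for your region and your own key pair):

      # Placeholder values: pick an Ubuntu AMI ID for your region and use your own key-pair name
      aws ec2 run-instances \
        --image-id ami-0abcdef1234567890 \
        --instance-type t2.micro \
        --key-name <your-key-pair> \
        --associate-public-ip-address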

  2. SSH into the EC2 instance - ssh -i <path-to-the-key-pair> ubuntu@<public-ip-of-the-instance>

  3. Follow these steps to install Java (v17 or v21), which is a prerequisite for Jenkins, and then install Jenkins itself.

    • sudo apt update

    • sudo apt install openjdk-17-jdk

    • java -version to verify the installation

    • curl -fsSL https://pkg.jenkins.io/debian/jenkins.io-2023.key | sudo tee /usr/share/keyrings/jenkins-keyring.asc > /dev/null

      echo "deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc] https://pkg.jenkins.io/debian binary/" | sudo tee /etc/apt/sources.list.d/jenkins.list > /dev/null

      sudo apt-get update

      sudo apt-get install jenkins -> this is for Ubuntu/Debian instances only.

    • sudo systemctl status jenkins -> to check if jenkins is active and running.

  4. Jenkins is accessible on port 8080, so edit the inbound rules of the security group attached to the EC2 instance to allow custom TCP traffic on port 8080.
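    The same rule can be added from the CLI with a sketch like this (the security-group ID is a placeholder for the one attached to your instance):

      # Placeholder security-group ID; 0.0.0.0/0 opens port 8080 to the world, acceptable for a demo
      aws ec2 authorize-security-group-ingress \
        --group-id <security-group-id> \
        --protocol tcp \
        --port 8080 \
        --cidr 0.0.0.0/0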

  5. Log in to Jenkins using the URL - http://<public-ip-of-the-instance>:8080

    Run sudo cat /var/lib/jenkins/secrets/initialAdminPassword to get the initial admin password to log in.

    Then install the suggested plugins.

    After installation, create the First Admin User and remember the credentials to be able to log in again.

    Now, the Jenkins setup is complete and ready to use!

  6. Since we are using Docker as the agent on the Master node, which is the EC2 instance, install Docker on it:

    • sudo apt update

    • sudo apt install docker.io

    • Grant the jenkins and ubuntu users permission to run Docker commands

      sudo su -

      usermod -aG docker jenkins

      usermod -aG docker ubuntu

      systemctl restart docker
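    As a quick sanity check, you can verify that the jenkins user can reach the Docker daemon (sudo -u jenkins starts a fresh process, so it picks up the new group membership; the already-running Jenkins service itself only sees it after the restart in step 8):

      sudo -u jenkins docker ps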

  7. Install the Docker Pipeline plugin on Jenkins as well

    • Go to Manage Jenkins > Manage Plugins.

    • In the Available tab, search for "Docker Pipeline".

    • Select the plugin and click the Install button.

  8. After these installations, restart Jenkins for the changes to be reflected and log back in with the credentials created.

    http://<public-ip-of-the-instance>:8080/restart
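    Alternatively, the service can be restarted from the shell:

      sudo systemctl restart jenkins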

Understanding the Jenkinsfile

It is crucial to understand the Jenkinsfile before starting to build the pipeline. The Jenkinsfile used here is very simple, but it can be extended to handle real applications too; it is the syntax that matters.

The pipeline is written using Groovy scripting.
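Here is a minimal sketch of such a declarative Jenkinsfile, consistent with the points below (the exact file lives in the repository linked later in this post):

  pipeline {
      // One agent declared at the top level: the whole run happens
      // inside a container created from this image.
      agent {
          docker { image 'node:16-alpine' }
      }
      stages {
          stage('Test') {
              steps {
                  // Print the Node.js version to confirm the
                  // node:16-alpine image is really being used.
                  sh 'node --version'
              }
          }
      }
  }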

  • agent must be specified at the beginning of the pipeline when a single agent serves the whole run. Here, just for demo purposes, the script consists of only one stage ('Test').

  • First, a container is created from the node:16-alpine Docker image, and the task is run in that environment only.

  • To confirm that the node image is being used, check the Node.js version in the first stage.

  • As seen here, a stage can consist of many steps too. I will write a continuation blog explaining multiple stages using multiple Docker agents.

Building the Pipeline

Now that everything required is configured, let's build our pipeline!

  1. Click 'New Item' in the Dashboard, select 'Pipeline' for orchestration, and name the pipeline.

  2. Configure the SCM for the pipeline, which is https://github.com/RachanaVenkat/Jenkins-docker-as-agent

    The 'Script Path' is just Jenkinsfile here, as it sits at the root of the repository. If not, the path to it must be specified.

  3. Click 'Build Now' on the Jenkins job and the pipeline execution will begin. The whole process can be viewed in detail in the Console Output.

  4. Jenkins follows the Jenkinsfile and first fetches the code from the GitHub repository.

  5. Then it checks whether a suitable Docker image already exists locally to run the task. As it doesn't, Jenkins pulls the Docker image mentioned in the Jenkinsfile, starts a container, and checks the Node.js version.

  6. The most crucial thing to notice in the console output is that, after checking the version, Jenkins stops the container and removes all the volumes attached to it. When a new build is triggered, a new container is spun up, and it is removed as soon as the build is complete.

    It can be noticed that this whole process took only 39 seconds to finish, which is very efficient.

So this is how the ephemeral nature of the container is advantageous, making the whole process cost- and resource-efficient, unlike when VMs are configured as Worker nodes. Additionally, if you now need to change the version of Node.js, all you have to do is update the image used as the agent in the Jenkinsfile.
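For example, moving from Node 16 to Node 18 (an assumed target version) would only require changing the agent image in the Jenkinsfile sketch above:

  agent {
      docker { image 'node:18-alpine' }   // was 'node:16-alpine'
  }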

Thank you all, have a good day :)


Written by

Rachana

Hi to the fellow tech enthusiasts out there! πŸ‘‹ I am an aspiring Cloud and DevOps Engineer ☁️ With strong foundation in containerization technologies like Docker and Kubernetes🐳 Capable of building resilient, secure and cost optimized infrastructure on AWS cloud - AWS SAA certified.β˜οΈπŸ”’ Currently learning to build CI/CD pipelines using Jenkins, github-actions, AWS CodePipeline, and many more.πŸ› οΈπŸ”„ Exploring other tools like Ansible for configuration management and Terraform for Infrastructure as Code(IaC).πŸ§©πŸ“œ Let's connect, learn and grow together! 🌟🀝