GitHub Actions Certification Prep - Part 4

Cache Dependencies

In our previous workflows, we saw that installing dependencies took up the major portion of the run time.

Since we install the same dependencies from the package.json file every time, we can cache them. Then, whenever the job runs on a different runner (macOS, Ubuntu, Windows), we don't need to install everything from scratch.

The cache action supports various types of package managers. In our case, it's a Node.js application and we are using npm.

So, let's use it for the npm package manager.

We have applied caching to the unit-testing job. The dependencies are installed from package-lock.json and kept within the node_modules folder.
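A minimal sketch of what that caching step can look like (the step name and key format here are assumptions; adapt them to your own workflow):

```yaml
- name: Cache NPM Dependency
  uses: actions/cache@v3
  with:
    path: node_modules
    # hashing package-lock.json means a lockfile change produces a new key
    key: ${{ runner.os }}-node-modules-${{ hashFiles('package-lock.json') }}

- name: Install Dependencies
  run: npm install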

Let’s push it now!

Now, under the repository's Actions → Caches page, we can see the caches that were just created.

If you now trigger the workflow by pushing another demo change, the workflow runs again.

This time, you can see the runners used the caches, and installing took less time than before.

So, we have successfully reduced the time for the workflow run.

Invalidate Cache

What if you have to add another dependency later on? You can add it to the package.json file and run the install.

Then package-lock.json will change as well. But if you run the workflow, since we already have caches saved, will the new dependency from package-lock.json still get installed, or will the stale cache be reused?

Let's do this install on our local PC.

I haven't committed yet, so there are no changes in my local package.json file.

Let's make the changes locally.

This was created automatically.

The package-lock.json was also modified automatically.

Let’s commit and push it

Note: we could have installed the dependency in a GitHub Codespace as well, instead of on the local machine.

Let’s push this as well

A new workflow has been triggered (since the workflow runs on pushed commits).

This time, the Cache NPM Dependency step reports an error: no matching cache was found.

This was expected: the new package-lock.json produces a new cache key, which is different from the older cache's key.

But after the job finished, the new cache was saved and is expected to be found in the next workflow runs. The other runners also tried to restore the old cache, but could not find it.

And if you check the Caches page now, you can see 2 new caches saved. This confirms that new caches were saved for the dependency change.

Note: we have been facing errors from code coverage. Let's solve this by using continue-on-error, as we did earlier.
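As a sketch, the step can be marked like this (the `run` command is an assumption; use whatever your coverage step already runs):

```yaml
- name: Check Code Coverage
  continue-on-error: true   # the job is reported as success even if this step fails
  run: npm run coverage
```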

Once pushed, a new workflow gets triggered.

This time it shows success despite the error on the step "Check Code Coverage".

Docker Login

As soon as unit testing and code coverage are done, we should move on to pushing the image to Docker Hub.

This is the new job we need to create on the same file.

Here it checks out the contents of the repo and logs in. To log in, we need to save our Docker Hub username and password at the repository level (as secrets).

Provide your own username and password here.

Then in the workflow, we have filled these in.
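A rough sketch of that job (the job name, action versions, and secret names are assumptions; match them to the secrets you created):

```yaml
containerization:
  name: Containerization
  needs: [unit-testing, code-coverage]   # runs only after these jobs succeed
  runs-on: ubuntu-latest
  steps:
    - name: Checkout Repository
      uses: actions/checkout@v4

    - name: Dockerhub Login
      uses: docker/login-action@v3
      with:
        username: ${{ secrets.DOCKERHUB_USERNAME }}
        password: ${{ secrets.DOCKERHUB_PASSWORD }}
```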

Let’s commit and push this change

Here we can see that this Containerization job depends on the success of the earlier jobs.

Let's remove macos-latest from the matrix to make the process faster, and then push again.

This time, it took much less time.

It worked perfectly.

Build the docker image and test it

We will use the verified action maintained by Docker.

Let’s add the new step to build our docker image in our last job

As we are only testing the image at this point, we won't push it. Also, in the context, we provide the path where the Dockerfile is.

As we have the Dockerfile in the root, we will use `.`
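Sketched out, the build-only step can look like this (the step name and action version are assumptions; the tag expression matches the one discussed below):

```yaml
- name: Docker Build For Testing
  uses: docker/build-push-action@v5
  with:
    context: .          # Dockerfile lives in the repository root
    push: false         # only build on the runner; we test before pushing
    tags: ${{ env.DOCKERHUB_USERNAME }}/solar-system:${{ github.sha }}
```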

In the Dockerfile, we generally describe how to build the Docker image.

First we start from the Node.js Alpine (3.17) base image, then create a working directory (/usr/app). Then we copy package.json and package-lock.json to the /usr/app folder, and install the dependencies from package.json using npm install. Then we copy all the remaining files from the solar-system repo.

Then we pass in the MongoDB database values needed to connect to it. Then we expose port 3000 to run our Node.js application. Finally, we run the app with an npm run command.
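Putting those steps together, the Dockerfile can look roughly like this (the exact base-image tag, env variable names, and start command are assumptions based on the description above):

```dockerfile
# Node.js on Alpine 3.17, as described above (exact tag is an assumption)
FROM node:18-alpine3.17

WORKDIR /usr/app

# Copy the dependency manifests first so the install layer can be cached
COPY package*.json /usr/app/

RUN npm install

# Copy the rest of the solar-system source code
COPY . .

# MongoDB connection values (placeholders; real values come from the workflow)
ENV MONGO_URI=mongo_uri_placeholder
ENV MONGO_USERNAME=mongo_username_placeholder
ENV MONGO_PASSWORD=mongo_password_placeholder

EXPOSE 3000

CMD [ "npm", "start" ]
```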

Also, we assign a tag to our image using the Docker Hub username and the SHA value of the commit (which will ultimately represent the version).

First, in the testing phase, we list all the Docker images we have, then run the image we created in the earlier step. `${{ env.DOCKERHUB_USERNAME }}/solar-system:${{ github.sha }}` identifies the image.

Then we use wget to test the REST endpoint at /live.

Note: we are not running this against a public IP, but on the VM's 127.0.0.1:3000.

This is why we used 127.0.0.1:3000/live, as this /live endpoint will send us back a result.
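The test step can be sketched like this (the container name and the grep check are assumptions):

```yaml
- name: Docker Image Test
  run: |
    docker images
    docker run --name solar-system-app -d -p 3000:3000 \
      ${{ env.DOCKERHUB_USERNAME }}/solar-system:${{ github.sha }}

    # hit the app on the runner's loopback address, not a public IP
    wget -q -O - 127.0.0.1:3000/live | grep live
```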

Let’s commit the changes now!!

Here is the workflow

You can see that we have created a docker image (docker.io/mitul3737/solar-system:e24bfd0f864f5124c80133025c6f341515be2c57) and also got a response from the /live end point ({"status":"live","timestamp":"2025-06-11T11:34:17.630Z"})

As we have successfully tested it, we can now push it.
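A sketch of the push step, reusing the same action with `push: true` (step name and action version are assumptions):

```yaml
- name: Docker Push
  uses: docker/build-push-action@v5
  with:
    context: .
    push: true   # this time, actually push the tested image to Docker Hub
    tags: ${{ env.DOCKERHUB_USERNAME }}/solar-system:${{ github.sha }}
```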

Once the push is done, we can verify the Docker image in Docker Hub.

GitHub Packages

GitHub Packages is a package hosting service (like npm, Docker, Maven, NuGet, etc.) integrated with GitHub. GitHub Actions can publish, install, or manage packages in GitHub Packages as part of CI/CD workflows.

The GITHUB_TOKEN is an automatically generated secret that allows GitHub Actions to authenticate with GitHub's APIs, including GitHub Packages.

GitHub Container Registry (GHCR)

Let’s push our code to GitHub Container Registry (GHCR)

We have added these lines to log in to GHCR with our GitHub username and the GITHUB_TOKEN (it is generated fresh each time the workflow starts and is used to authenticate; it acts as an alternative to our GitHub password).

Finally, to push the Docker image to both Docker Hub and the GitHub Container Registry, we just modified the earlier step.

Also, we need to add permissions for GitHub Packages and for the checkout.
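Those additions can be sketched roughly as follows (step names and action versions are assumptions; the permissions block goes at the job level):

```yaml
permissions:
  contents: read     # needed for checkout
  packages: write    # needed to push to GHCR

steps:
  - name: GHCR Login
    uses: docker/login-action@v3
    with:
      registry: ghcr.io
      username: ${{ github.repository_owner }}
      password: ${{ secrets.GITHUB_TOKEN }}   # generated per workflow run

  - name: Docker Build and Push
    uses: docker/build-push-action@v5
    with:
      context: .
      push: true
      tags: |
        ${{ env.DOCKERHUB_USERNAME }}/solar-system:${{ github.sha }}
        ghcr.io/${{ github.repository_owner }}/solar-system:${{ github.sha }}
```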

Now, the workflow looks like this

And there is a Docker image pushed to Docker Hub and a GitHub package created in your repository.

Publish Node.js packages

Check this page

The name and version fields in the package.json file create a unique identifier that registries use to link your package to a registry. You can add a summary for the package listing page by including a description field in the package.json file. For more information, see Creating a package.json file and Creating Node.js modules in the npm documentation.

When a local .npmrc file exists and has a registry value specified, the npm publish command uses the registry configured in the .npmrc file. You can use the setup-node action to create a local .npmrc file on the runner that configures the default registry and scope.

Let's import this repo as a private repo of ours.

Then go to VS Code.

We already have publish-package.yaml here.

This workflow triggers when a new release is published, and the job runs on an Ubuntu runner. As for the steps: we check out the contents of the repository (checkout repository), then set the Node.js version, install the dependencies for the Node.js project, and finally publish to GitHub Packages using the GITHUB_TOKEN.

We also need to grant permissions (to read the repository contents, set contents: read; to give package access, set packages: write).
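The publish-package.yaml described above can be sketched roughly like this (the exact step names and Node version are assumptions):

```yaml
name: Publish Package

on:
  release:
    types: [published]   # runs only when a release is published

jobs:
  publish:
    runs-on: ubuntu-latest
    permissions:
      contents: read    # to check out the repository
      packages: write   # to publish the package
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4

      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: '20'
          registry-url: 'https://npm.pkg.github.com'  # writes a local .npmrc

      - name: Install Dependencies
        run: npm ci

      - name: Publish to GitHub Packages
        run: npm publish
        env:
          NODE_AUTH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```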

Once done, we need to update our package.json file

Here, we have added the package name, the author, the homepage (set to LinkedIn), etc. Let's commit these 2 files.

Once committed and pushed, no workflow is triggered yet. That's because this workflow only runs when a release is published.

Let’s go to our repository on GitHub

And press Create a new release.

As a new release has been created, a new workflow is triggered.

We can see the workflow result here

This is our new package

Once you go to your repository now, you can see that as well

Here we go…this is our npm package for this project

Also, in the GitHub account’s packages, we can see this

Scenario: Assume the production database (the mongodb database we created earlier) became laggy and slow.

Let’s analyze the last action we created in the solar-system repo.

Here, both the unit testing job and the code coverage job are connecting to the production database.

This overwhelms the production database with test traffic. To solve this issue, we can use service containers for unit testing and code coverage. That way, a mock database is created and connected to these jobs, and the pressure on the production database is relieved.

Job containers

First, we need to understand how a workflow runs on a runner.

For example, on the left side we have a workflow YAML file which should run on ubuntu-latest. The ubuntu-latest runner runs on Azure. This Azure VM has some pre-configured packages that help run whatever we specify in the workflow.

When the commands run on the GitHub-hosted runner, it takes some time to complete the steps defined in the workflow.

In contrast, a job container in GitHub Actions is a Docker container used to run the steps of a job. It provides isolation, security, and more, so no conflicts occur with the runner's pre-installed tooling.

Here is a workflow which employs a job container to execute a custom image that includes the Node.js 20 runtime (image: ghcr.io/node-and-packages:20) and the necessary testing packages.

In this way, a Docker container is created inside the GitHub-hosted runner and the steps are executed within it. So the job remains isolated from the rest of the runner, and Docker's caching and related mechanisms make the process much faster.
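A minimal sketch of a job container, using the image named above (the steps themselves are assumptions):

```yaml
jobs:
  unit-testing:
    runs-on: ubuntu-latest
    container:
      image: ghcr.io/node-and-packages:20   # custom Node.js 20 image from the example
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4

      - name: Run Unit Tests
        run: npm test   # executes inside the container, not directly on the VM
```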

Service containers

Service containers are also Docker containers; they provide a simple and portable way to host services that a workflow needs to test against, for example during unit testing or code coverage.

In our case, we had unit tests, code coverage, etc., for which we were using the production database.

For tasks like unit testing or code coverage, we can connect these jobs to a service container, here called mongodb-service. Its main task is to act as a mock database. Once we connect unit testing to this service container, we are no longer populating our real production database (configured in our GitHub workflow); instead we use the same kind of database running in a container. We then use port mapping to connect them.

Best solution

As we learned about job containers earlier, we can now use one here too. That job container will remain isolated and connect to the service container (the mock MongoDB).

They are also easy to connect, since both containers are part of the same virtual machine (ubuntu-latest runs on an Azure VM, which is where the steps normally run). We create the job container and the service container there, and they contact each other seamlessly because they have private IPs within each other's range, being part of the same VM.

Let’s apply this concept to our workflow now.

This was our workflow in the solar-system repository's feature/explore branch.

Here, the unit testing and code coverage jobs depend on our production database specified in the env variables.

So, let's create the service container at the job level (meaning we create a MongoDB database whose job is to provide the service, and we keep it under the job so that unit testing and the other steps can use it).

Here we are using this docker image

and mapped port 27017 from the Docker container to the same port on the runner (ubuntu-latest running on Azure). This way, the container can be accessed using localhost.

Once we have created the service container for MongoDB, we need to access it, so let's set the env variables at the job level (under the job).
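Combined, the service container and job-level env variables can be sketched like this (the image name, credentials, and database name are placeholders, not the actual values from the course; substitute the mock MongoDB image shown above):

```yaml
unit-testing:
  runs-on: ubuntu-latest
  services:
    mongodb-service:
      image: mongo:6          # placeholder; use the mock MongoDB image mentioned above
      ports:
        - 27017:27017         # map the container port to the runner's port
  env:
    # the service is reachable on localhost thanks to the port mapping
    MONGO_URI: mongodb://localhost:27017/superData
    MONGO_USERNAME: non-prod-user        # placeholder credentials
    MONGO_PASSWORD: non-prod-password
  steps:
    - name: Checkout Repository
      uses: actions/checkout@v4

    - name: Run Unit Tests
      run: npm test
```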

Then update app.js

The workflow should work now!

Also, for the code coverage job, we have to create the job container and then connect it to the MongoDB service container.

Let’s push it. This is the workflow now.

Done!!


Written by

Md Shahriyar Al Mustakim Mitul