CI/CD Pipelines with GitHub Actions and Google Cloud Run

Introduction

In the fast-paced world of software development, Continuous Integration and Continuous Deployment (CI/CD) pipelines have become essential for ensuring code quality and rapid delivery. GitHub Actions, a popular CI/CD tool, combined with Google Cloud Run, offers a powerful and scalable solution for deploying containerized applications. In this article, we'll walk through setting up a CI/CD pipeline using GitHub Actions to deploy a Node.js application with TypeScript and Prisma to Google Cloud Run.

One of the strengths of CI/CD pipelines is their flexibility and adaptability to various languages and frameworks. While the example provided in this article focuses on a Node.js project using TypeScript and Prisma, the overall structure and steps can be easily adapted for different programming languages and ORMs.

Configuring GitHub Actions Workflow

Create a .github/workflows/deploy.yml file in your repository:

on:
  push:
    branches:
      - production

name: Build and Deploy to Cloud Run
env:
  PROJECT_ID: ${{ secrets.GCP_PROJECT }}
  SA_KEY: ${{ secrets.GCP_SA_KEY }}
  GCP_SQL_INSTANCE: ${{ secrets.GCP_SQL_INSTANCE }}
  SERVICE: cloudrun-service-name
  REGION: us-central1

jobs:
  build:
    runs-on: ubuntu-22.04
    timeout-minutes: 10
    strategy:
      matrix:
        node-version: [18.16]
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: ${{ matrix.node-version }}

      - name: Authenticate to Google Cloud
        uses: google-github-actions/auth@v2
        with:
          credentials_json: ${{ env.SA_KEY }}

      - name: Setup Cloud SDK
        uses: google-github-actions/setup-gcloud@v2
        with:
          project_id: ${{ env.PROJECT_ID }}

      - name: Authorize Docker to push images to GCP
        run: gcloud auth configure-docker

      - name: Create env file
        run: |
          touch .env.production
          echo PORT=3000 >> .env.production
          echo INTERNAL_SERVICE=${{ secrets.INTERNAL_SERVICE }} >> .env.production

      - name: Build and Push image to GCP
        run: |
          docker build -t gcr.io/${{ env.PROJECT_ID }}/${{ env.SERVICE }}:latest .
          docker push gcr.io/${{ env.PROJECT_ID }}/${{ env.SERVICE }}:latest

      - name: Get public IP
        id: ip
        uses: haythem/public-ip@v1.2

      - name: Authorize current IP on GCP SQL
        run: |
          gcloud sql instances patch ${{ env.GCP_SQL_INSTANCE }} \
            --authorized-networks=${{ steps.ip.outputs.ipv4 }} --quiet

      - name: Run migrations
        run: |
          cd $GITHUB_WORKSPACE
          npm install
          NODE_ENV=production npx prisma migrate deploy
        env:
          DATABASE_URL: ${{ secrets.DATABASE_URL_PUBLIC_IP }}

      - name: Remove current IP on GCP SQL
        if: always()
        run: |
          gcloud sql instances patch ${{ env.GCP_SQL_INSTANCE }} \
            --clear-authorized-networks --quiet

      - name: Deploy image pushed to Cloud Run
        run: |
          gcloud run deploy ${{ env.SERVICE }} \
            --region ${{ env.REGION }} \
            --image gcr.io/${{ env.PROJECT_ID }}/${{ env.SERVICE }}:latest \
            --platform "managed" \
            --quiet

This workflow triggers when a merge lands on the production branch. To summarize: first, I authenticate to my GCP account and authorize Docker to push images to the project's registry. Next, I create a production environment file, build the Docker image from my Dockerfile, and push it to GCP. Then I temporarily authorize the runner's public IP on the Cloud SQL instance, run the Prisma migrations against the database used by my Cloud Run service, and remove that IP again. Finally, I deploy the pushed image to the Cloud Run service.

Configuring GitHub Secrets

Add the secrets the workflow references (GCP_PROJECT, GCP_SA_KEY, GCP_SQL_INSTANCE, DATABASE_URL_PUBLIC_IP, INTERNAL_SERVICE) in your GitHub repository settings under Secrets. You can also set them from the command line, as shown after this list.

  • GCP_PROJECT: The ID of your GCP project.

  • GCP_SA_KEY: The JSON key of the service account used to deploy to Cloud Run.

  • GCP_SQL_INSTANCE: The name of your Cloud SQL instance in GCP.

  • DATABASE_URL_PUBLIC_IP: The connection URL for your database, reached through the instance's public IP. For example: postgresql://<username>:<password>@<instance-public-ip>:<port>/<database_name>

  • INTERNAL_SERVICE: An application-specific value that the workflow writes into .env.production; replace it with whatever variables your own application needs.
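
If you prefer the command line, here is a minimal sketch for creating the service account key and storing the secrets, assuming you have the gcloud and GitHub (gh) CLIs installed and authenticated; the service account, project, and instance names are placeholders:

# Create a JSON key for the deploy service account (placeholder names)
gcloud iam service-accounts keys create key.json \
  --iam-account=deployer@my-project.iam.gserviceaccount.com

# Store the values as GitHub Actions secrets in the current repository
gh secret set GCP_PROJECT --body "my-project"
gh secret set GCP_SA_KEY < key.json
gh secret set GCP_SQL_INSTANCE --body "my-sql-instance"
gh secret set DATABASE_URL_PUBLIC_IP --body "postgresql://<username>:<password>@<instance-public-ip>:5432/<database_name>"

Remember to delete the local key.json afterwards, since it grants access to your project.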

Dockerizing the Application

Docker is essential for creating a consistent and reproducible environment for your application. Here's a Dockerfile for a Node.js application using TypeScript and Prisma.

# Build stage: compile the TypeScript sources
FROM node:18.16 AS build
WORKDIR /usr/src/app
COPY ./package*.json ./
RUN npm install
COPY . .
RUN npm run build

# Run stage: install dependencies and run the compiled app
FROM node:18.16
WORKDIR /usr/src/app
COPY ./package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
# Bring in the compiled output from the build stage
COPY --from=build /usr/src/app/dist ./dist

# Generate the Prisma client for the runtime image
RUN npx prisma generate
CMD ["npm", "run", "start"]
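
Since both stages run COPY . ., the whole project directory becomes part of the build context. If you don't already have one, a .dockerignore along these lines keeps local artifacts out of the image (a minimal sketch; adjust to your project, and note that .env.production must not be listed here, because the workflow bakes it into the image):

# Keep the build context small
node_modules
dist
.git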

Enhancing the CI/CD Pipeline

To further enhance your CI/CD pipeline, consider integrating automated tests to ensure code quality before deployment. Incorporating linting and security checks can also help identify potential issues early. Typically, these steps are configured to run on pushes to the master or main branches in GitHub, ensuring that only well-tested and secure code is deployed. By automating these processes, you improve the reliability and security of your deployments, ultimately leading to a more robust and efficient development workflow.
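
As a sketch of what such a workflow could look like, the job below runs linting and tests on pull requests and pushes to main. It assumes that lint and test scripts exist in package.json and that a package-lock.json is committed; it is separate from the deploy workflow above:

on:
  pull_request:
  push:
    branches:
      - main

name: Test and Lint

jobs:
  test:
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-node@v4
        with:
          node-version: 18.16

      - run: npm ci
      - run: npm run lint
      - run: npm test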

Conclusion

In this article, we've covered setting up a CI/CD pipeline using GitHub Actions and deploying a Node.js application with TypeScript and Prisma to Google Cloud Run. By automating the build and deployment process, you can ensure consistent and reliable releases, allowing your development team to focus on building great features. Embrace CI/CD in your projects to enhance productivity and code quality.
