How I Completed the Cloud Resume Challenge: A Personal Journey

Sagnik Pal

Last month, while browsing the web, I came across Forrest Brazeal's "Cloud Resume Challenge", which I found both interesting and timely. I had just begun exploring my interest in cloud services and various DevOps tools like Docker, Kubernetes, and CI/CD pipelines. So I decided to try to complete the challenge in the coming week, since I already had a basic understanding of these services.

Infrastructure And Automation:

The first part I wanted to tackle was the entire infrastructure of the challenge; the parts required for display could be filled in afterwards. To add my own twist to the challenge, I planned on incorporating the principle of Infrastructure as Code (IaC) using Terraform.

I chose Terraform over AWS SAM and other options because it is an industry standard and I personally prefer it. At first, I set up the entire infrastructure using a "ClickOps" approach, manually selecting and configuring all the necessary services in the console.

The things I struggled with the most were:

  • Resolving my subdomain to the CloudFront distribution using Route 53

  • The Origin Access Control (OAC) policies of CloudFront

These problems turned out to be pretty trivial once I actually solved them, but they emphasised the fact that issues can be solved faster by eliminating possible causes one by one. After this, I created the API Gateway-Lambda-DynamoDB pipeline, which was pretty easy in my opinion. Now that my entire architecture was created, I moved forward with automation.
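
For anyone stuck on the same two points, here is a minimal Terraform sketch of the OAC and the alias record that resolves the domain to the distribution. It assumes the locals from the config.tf shown later in this post, and the resource names (including the aws_cloudfront_distribution.site reference) are hypothetical:

resource "aws_cloudfront_origin_access_control" "site" {
  name                              = local.cloudfront_oac_name
  origin_access_control_origin_type = "s3"
  signing_behavior                  = "always"
  signing_protocol                  = "sigv4"
}

# Alias record that points the domain at the CloudFront distribution
resource "aws_route53_record" "site" {
  zone_id = local.hosted_zone_id
  name    = local.domain
  type    = "A"

  alias {
    name                   = aws_cloudfront_distribution.site.domain_name
    zone_id                = aws_cloudfront_distribution.site.hosted_zone_id
    evaluate_target_health = false
  }
}

The trick with OAC is that it is attached on the CloudFront origin and then authorized through the bucket policy, not through the bucket's ACLs.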

Automating with Terraform was the difficult and creative part, and it led to me spending hours in the Terraform Registry, going through examples and whatnot. But the reward was pretty satisfying once my entire infrastructure could be spun up using just two commands: terraform init, then terraform apply. Navigating this part taught me the importance of reading documentation and examples.

Some tips I want to share:

  • Do not try to automate the Route 53 hosted zone and certificate creation.

  • For now, do that manually and implement the other parts as code.

  • Write a config.tf using the format below, and you should be good to go with your code.

terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = "us-east-1"
}

locals {
  s3_bucket_name      = "some unique name"
  domain              = "domain you bought"
  hosted_zone_id      = "ID of the Route53 hosted zone"
  cert_arn            = "arn:aws:acm:XXXXXXXXXXXXXXXX"
  cloudfront_oac_name = "some name"
}
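
To show how these locals get consumed, here is a trimmed sketch of the S3 bucket and the CloudFront distribution in front of it. This is an illustration rather than my exact configuration; the resource names and the managed cache policy are assumptions:

# Private S3 bucket that holds the built site
resource "aws_s3_bucket" "site" {
  bucket = local.s3_bucket_name
}

# AWS-managed caching policy, looked up by name
data "aws_cloudfront_cache_policy" "optimized" {
  name = "Managed-CachingOptimized"
}

resource "aws_cloudfront_distribution" "site" {
  enabled             = true
  default_root_object = "index.html"
  aliases             = [local.domain]

  origin {
    domain_name              = aws_s3_bucket.site.bucket_regional_domain_name
    origin_id                = "s3-site"
    origin_access_control_id = aws_cloudfront_origin_access_control.site.id
  }

  default_cache_behavior {
    allowed_methods        = ["GET", "HEAD"]
    cached_methods         = ["GET", "HEAD"]
    target_origin_id       = "s3-site"
    viewer_protocol_policy = "redirect-to-https"
    cache_policy_id        = data.aws_cloudfront_cache_policy.optimized.id
  }

  viewer_certificate {
    acm_certificate_arn = local.cert_arn
    ssl_support_method  = "sni-only"
  }

  restrictions {
    geo_restriction {
      restriction_type = "none"
    }
  }
}

You will also need a bucket policy that lets the cloudfront.amazonaws.com service principal call s3:GetObject, scoped to this distribution's ARN with an AWS:SourceArn condition.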

Frontend and CI/CD Pipeline:

The reason I wanted to address the frontend and CI/CD pipeline together is that I didn't want to manually update my files in the S3 bucket after each iteration. Implementing a CI/CD pipeline seemed like the perfect solution for this.

For the frontend, I chose a template from Bartosz Jarocki's GitHub, for which I am truly grateful, as it is a nightmare for me to figure out my own designs; I work better and faster with templates. The template was in Next.js, and after studying it for a while, I implemented a pretty great navbar and added some extra content.

For the visitor count, you can keep a dummy URL for now and update it with the public URL of the API Gateway after terraform apply.

name: Build website and upload to s3

on:
  push:
    branches:
    - main

jobs:
  build-website-and-upload-to-s3:
    runs-on: ubuntu-latest
    steps:
      - name: checkout
        uses: actions/checkout@v3

      - uses: actions/setup-node@v3
        with:
          node-version: "22.x"

      - name: Install yarn
        run: npm install -g yarn

      - name: Install dependencies
        run: cd ./frontend/ && yarn install

      - name: Build the out file
        run: cd ./frontend/ && yarn build

      - name: Syncing to s3
        uses: jakejarvis/s3-sync-action@master
        with:
          args: --follow-symlinks --delete --exclude '.git/*' --size-only
        env:
          AWS_S3_BUCKET: ${{ secrets.AWS_S3_BUCKET }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          AWS_REGION: 'us-east-1'  
          SOURCE_DIR: 'frontend/out'

  invalidate-cloudfront-cache:
    needs: build-website-and-upload-to-s3
    runs-on: ubuntu-latest
    steps:
      - name: Invalidate CloudFront
        uses: chetan/invalidate-cloudfront-action@v2
        env:
          DISTRIBUTION: ${{ secrets.CLOUDFRONT_DISTRIBUTION }}
          PATHS: "/*"
          AWS_REGION: "us-east-1"
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}

This is the CI/CD pipeline I made for GitHub Actions, and the true beauty of it is that it works with any Next.js project that uses static export and wants this kind of cloud infrastructure.

Backend:

For the backend, it was a pretty straightforward API Gateway-Lambda-DynamoDB implementation. The code for the Lambda function is:

import json
import boto3

# Table that stores the visitor counter
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("resumeVisitorTable")


def lambda_handler(event, context):
    try:
        statusCode = 200
        # Atomically increment the counter; the item with id "visitors-count"
        # must already exist with a numeric "visitors" attribute
        response = table.update_item(
            Key={"id": "visitors-count"},
            UpdateExpression="SET visitors = visitors + :val",
            ExpressionAttributeValues={":val": 1},
            ReturnValues="UPDATED_NEW"
        )
        body = json.dumps({"count": int(response["Attributes"]["visitors"])})

    except Exception as e:
        statusCode = 400
        body = json.dumps({"error": str(e)})

    # Lambda proxy integration response; the CORS headers let the
    # frontend call this API from the browser
    return {
        "statusCode": statusCode,
        "headers": {
            "Content-Type": "application/json",
            "Access-Control-Allow-Origin": "*",
            "Access-Control-Allow-Methods": "OPTIONS,POST,GET"
        },
        "body": body
    }
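
If you define the table in Terraform as well, a minimal sketch looks like this (only the table name and the id hash key are dictated by the Lambda code above; on-demand billing is my assumption). Note that the update expression fails unless the counter item already exists, so it helps to seed it:

resource "aws_dynamodb_table" "visitors" {
  name         = "resumeVisitorTable"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "id"

  attribute {
    name = "id"
    type = "S"
  }
}

# Seed the counter item so the update expression has something to increment
resource "aws_dynamodb_table_item" "counter" {
  table_name = aws_dynamodb_table.visitors.name
  hash_key   = aws_dynamodb_table.visitors.hash_key

  item = jsonencode({
    id       = { S = "visitors-count" }
    visitors = { N = "0" }
  })

  # Don't reset the count on subsequent applies
  lifecycle {
    ignore_changes = [item]
  }
}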

There are plenty of resources out there that can help with this part.


Conclusion:

This was truly a fun challenge to work on. I am also working on the supply-chain security mod of this challenge, so there will be future blog posts about that.

Finally, you can check out my links below.
