My Journey Through the AWS Cloud Resume Challenge 😄😎


Embarking on the Cloud Resume Challenge was an eye-opening experience, pushing me to build a cloud-based resume website using AWS services. It wasn't just about showcasing my resume; it was about proving my cloud skills. Every step of the process taught me something new about cloud infrastructure, automation, and deployment.
I started by crafting my resume using HTML. I wanted a sleek, coded webpage that would give me complete control over styling and future functionality, so I grabbed an HTML5 template. I chose a design with simple fonts, subtle colors, and responsive design principles; it came with HTML and CSS files, which I edited to my satisfaction.
Once my resume was designed, I deployed my HTML and CSS files to an S3 bucket configured as a static website, with all public access blocked. Using S3 gave me a hands-on understanding of cloud storage and hosting. Security was my next focus, so I served the site through Amazon CloudFront to enable HTTPS, using origin access control (OAC) and a bucket policy so that only CloudFront could read from the bucket. This secured my site and improved its performance with global content delivery.
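Under the hood, that bucket policy just grants s3:GetObject to the CloudFront service principal, scoped to a single distribution. Here's a minimal sketch of attaching such a policy with boto3; the bucket name, account ID, and distribution ARN are placeholders, not my real values:

```python
import json
import boto3

BUCKET = "my-resume-site-bucket"  # placeholder bucket name
DISTRIBUTION_ARN = (
    "arn:aws:cloudfront::123456789012:distribution/EDFDVBD6EXAMPLE"  # placeholder
)

# Allow only CloudFront (via origin access control) to read objects.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowCloudFrontServicePrincipal",
            "Effect": "Allow",
            "Principal": {"Service": "cloudfront.amazonaws.com"},
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
            "Condition": {"StringEquals": {"AWS:SourceArn": DISTRIBUTION_ARN}},
        }
    ],
}

boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```

With this in place, requests that bypass CloudFront and hit the bucket directly are denied, which is exactly the point.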
To give my site a professional touch, I registered a custom domain and linked it to my CloudFront distribution using Route 53. My resume is live at joshuaidowuresume.com, adding credibility and personalization to my cloud project.
I wanted my resume to be more than static content, so I added a dynamic visitor counter using JavaScript. This made my resume interactive and added a layer of functionality that showcased my front-end skills. But where would I store the visitor count data? I chose DynamoDB, Amazon’s NoSQL database, and opted for on-demand pricing to keep costs low.
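The counter only needs a one-item table. As a rough sketch (the table and key names here are illustrative, not necessarily what I used), creating it with boto3 and on-demand billing looks like this:

```python
import boto3

dynamodb = boto3.client("dynamodb")

# One small table; PAY_PER_REQUEST means no provisioned capacity to pay for.
dynamodb.create_table(
    TableName="visitor-count",  # hypothetical table name
    AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)
```

For a site with light, bursty traffic like a resume page, on-demand billing means I pay fractions of a cent rather than for idle provisioned throughput.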
Instead of connecting JavaScript directly to DynamoDB — which would have been insecure — I built a secure API using API Gateway and Lambda. This allowed safe, scalable communication between the front end and the database. I wrote my Lambda functions in Python, using the boto3 library to interact with AWS services. Python’s flexibility and wide usage in cloud computing made it the perfect choice.
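The heart of a counter Lambda like mine fits in a few lines. This is a simplified sketch, assuming a table named visitor-count with a string key id; the atomic ADD update avoids read-modify-write races between concurrent visitors:

```python
import json
import boto3

# Table and key names are assumptions for illustration;
# an AWS region is expected to be configured in the environment.
table = boto3.resource("dynamodb").Table("visitor-count")

def lambda_handler(event, context):
    # Atomically increment the counter and read back the new value.
    result = table.update_item(
        Key={"id": "resume"},
        UpdateExpression="ADD visits :one",
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(result["Attributes"]["visits"])
    return {
        "statusCode": 200,
        # CORS header so the resume page's JavaScript can call the API.
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": count}),
    }
```

The front-end JavaScript then only needs a single fetch to the API Gateway endpoint to display the returned count.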
To maintain high-quality code, I implemented unit tests for my Python functions. Running tests ensured that my API worked flawlessly before deployment. But manual configuration wasn’t the way forward; I needed automation.
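To give a flavor of those tests, here's a minimal sketch that mocks out the DynamoDB table with unittest.mock so a handler like the one above can run without touching AWS; the handler module name is an assumption:

```python
import json
import unittest
from unittest.mock import patch

import handler  # hypothetical module containing lambda_handler

class TestVisitorCounter(unittest.TestCase):
    def test_increment_returns_new_count(self):
        # Replace the real DynamoDB table with a mock for the test.
        with patch.object(handler, "table") as mock_table:
            mock_table.update_item.return_value = {"Attributes": {"visits": 42}}
            response = handler.lambda_handler({}, None)

        self.assertEqual(response["statusCode"], 200)
        self.assertEqual(json.loads(response["body"])["count"], 42)

if __name__ == "__main__":
    unittest.main()
```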
Rather than manually setting up cloud resources, I used Terraform to define and provision my infrastructure. Writing Terraform configuration files automated the deployment of my DynamoDB table, API Gateway, Lambda functions, and S3 bucket. This saved time and ensured my cloud architecture was consistent and repeatable.
Version control played a crucial role in organizing my project. I created two GitHub repositories: one for my front-end (resume code) and another for my back-end (API and infrastructure). This separation kept my work structured and collaboration-ready.
Automation didn't stop there. I used GitHub Actions to implement CI/CD pipelines. For the back end, GitHub Actions ran my Python tests and automatically deployed my Terraform configurations to AWS. For the front end, I set up a workflow that updated my S3 bucket and invalidated the CloudFront cache whenever I pushed new website code.
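The workflow itself is YAML, but its cache-busting step boils down to a single CloudFront API call. For illustration, the boto3 equivalent of that step looks like this; the distribution ID is a placeholder:

```python
import time
import boto3

cloudfront = boto3.client("cloudfront")

# Invalidate everything so CloudFront serves the freshly uploaded files.
cloudfront.create_invalidation(
    DistributionId="EDFDVBD6EXAMPLE",  # placeholder distribution ID
    InvalidationBatch={
        "Paths": {"Quantity": 1, "Items": ["/*"]},
        # Unique token so repeated runs create distinct invalidations.
        "CallerReference": str(int(time.time())),
    },
)
```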
Pro Tip: I never committed AWS credentials to GitHub, using IAM roles and secret management best practices to protect my cloud environment.
By completing the Cloud Resume Challenge, I didn't just build a resume; I built a cloud-powered portfolio piece. This project took me through every layer of modern web development and cloud architecture, making me confident in deploying real-world solutions.
My cloud portfolio can be found at joshuaidowuresume.com.