🚀 My AWS Cloud Quest: Cloud Practitioner Journey — A Gamified Path to Cloud Mastery

🌟 Introduction

Cloud computing isn’t just the future — it’s the present. With organizations rapidly shifting to the cloud, having AWS skills is now a superpower for students, developers, and aspiring tech professionals.

Recently, I completed AWS Cloud Quest: Cloud Practitioner and earned my official AWS Cloud Quest badge 🏆.

This was not just another online course — it was a gamified, hands-on journey where I learned real AWS concepts by solving virtual challenges and building a city.

In this blog, I’ll share:

  • My experience with AWS Cloud Quest

  • The gamified learning approach

  • Key skills I gained

  • Why this is a golden opportunity for students

💡 If you’re a college student or beginner in cloud, this can be your gateway to mastering AWS.

🎮 What is AWS Cloud Quest?

AWS Cloud Quest is a role-based learning game designed to teach you AWS Cloud fundamentals while keeping you engaged through missions and challenges.

You create a virtual avatar, explore a city, solve problems for “customers,” and complete hands-on labs using real AWS services.

Main Features:

  • 🎯 Role-based learning (Cloud Practitioner track in my case)

  • 🏗 12 real-world hands-on assignments

  • 📚 Quizzes, videos, and architecture diagrams

  • 🏆 Competitive “Tournament Mode”

  • 💼 Earn an official AWS badge (shareable via Credly)

🗺 My Journey — Level by Level

AWS Cloud Quest: Cloud Practitioner consists of 12 missions, and each mission includes a timer that tracks how long you spend completing the DIY (Do It Yourself) exercises. This helps you monitor your progress and efficiency as you work through the hands-on labs.

Here’s what I learned at each stage:

Level 1: Cloud Computing Essentials

In this first mission, I learned the foundational concepts of cloud computing and how AWS services can be used to improve website reliability.

Challenge:
The task was to migrate an existing website to static website hosting on Amazon S3.

Hands-On Work:

  • Enabled static website hosting on an Amazon S3 bucket

  • Configured a bucket policy to allow public read access to the site content

  • Practiced customizing the site by renaming index.html to waves.html

Key Learning:
I understood how Amazon S3 can serve as a cost-effective, highly available solution for hosting static websites, and how security is maintained through bucket policies.

Outcome:
Completed the migration successfully and saw the hosted site live via the S3 endpoint — a great start to the AWS Cloud Quest journey!

Level 2: Cloud First Steps

This mission focused on launching and configuring Amazon EC2 instances to support the island’s stabilization system, while learning about AWS regions and availability zones.

Challenge:
Create Amazon EC2 instances in multiple Availability Zones to improve reliability and ensure system stability.

Hands-On Work:

  • Launched the first EC2 instance in a chosen AWS Region

  • Configured a user data script so the instance automatically displayed its details in a web browser upon launch

  • Deployed a second EC2 instance in a different Availability Zone within the same region for high availability

Key Learning:
I learned the basics of launching EC2 instances, automating configuration with user data scripts, and using multiple Availability Zones to make systems more fault-tolerant.

Outcome:
Both EC2 instances ran successfully, each in different Availability Zones, demonstrating AWS’s approach to high availability and disaster resilience.
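
The same setup can be scripted. Below is a rough boto3 sketch that launches one instance per Availability Zone with a user data script; the region, AZ names, and AMI ID are placeholders:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

# User data script that serves the instance's details on first boot
user_data = """#!/bin/bash
yum install -y httpd
echo "Hello from $(hostname -f)" > /var/www/html/index.html
systemctl start httpd
"""

# Launch one instance per Availability Zone for fault tolerance
for az in ["us-east-1a", "us-east-1b"]:
    ec2.run_instances(
        ImageId="ami-xxxxxxxx",  # placeholder Amazon Linux AMI ID
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
        UserData=user_data,
        Placement={"AvailabilityZone": az},
    )
```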

Level 3: Computing Solutions

This mission was all about understanding Amazon EC2 instance types and how scaling compute resources can improve application performance.

Challenge:
Upgrade an existing EC2 instance to a larger instance type to handle increased performance demands.

Hands-On Work:

  • Explored various EC2 instance families and their use cases

  • Filtered EC2 instances based on attributes (vCPUs, memory, etc.)

  • Connected to an instance using EC2 Instance Connect

  • Viewed instance metadata through the public IP address

  • Practiced starting and stopping EC2 instances from the console

  • Upgraded the instance from a t3.micro to an m4.large general-purpose instance

Key Learning:
I learned how to choose the right EC2 instance type for workloads, scale up compute resources when needed, and manage instances efficiently using AWS tools.

Outcome:
The upgraded instance successfully supported higher performance requirements, demonstrating AWS’s flexibility in scaling computing resources.
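
Resizing an instance follows a stop, modify, start sequence. Here's a minimal boto3 sketch of that flow (the instance ID is a placeholder):

```python
import boto3

ec2 = boto3.client("ec2")
instance_id = "i-0123456789abcdef0"  # placeholder instance ID

# The instance type can only be changed while the instance is stopped
ec2.stop_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_stopped").wait(InstanceIds=[instance_id])

# Resize from t3.micro to m4.large
ec2.modify_instance_attribute(
    InstanceId=instance_id,
    InstanceType={"Value": "m4.large"},
)

ec2.start_instances(InstanceIds=[instance_id])
```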

Level 4: Networking Concepts

This mission introduced me to Amazon VPC (Virtual Private Cloud) and how its components work together to enable secure communication between resources.

Challenge:
Identify and fix a network connectivity issue by adjusting VPC configurations and security rules.

Hands-On Work:

  • Explored key VPC components (subnets, route tables, internet gateways, and security groups)

  • Configured a route table to direct internet-bound traffic through an internet gateway

  • Adjusted security group inbound rules to allow necessary traffic

  • Specifically enabled port 3306 access for the database server to accept MySQL connections

  • Verified that the web server and DB server could communicate as intended

Key Learning:
I learned how VPC networking is structured, how routing works within AWS, and how security groups act as virtual firewalls to control access.

Outcome:
Fixed the connectivity issue successfully, ensuring the application could communicate securely with the database over port 3306.
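
In code, the two fixes from this mission look roughly like the boto3 sketch below. All resource IDs are placeholders; the key detail is that restricting port 3306 to the web tier's security group keeps the database closed to everything else:

```python
import boto3

ec2 = boto3.client("ec2")

# Route internet-bound traffic through the internet gateway
ec2.create_route(
    RouteTableId="rtb-0123456789abcdef0",      # placeholder IDs
    DestinationCidrBlock="0.0.0.0/0",
    GatewayId="igw-0123456789abcdef0",
)

# Allow MySQL (port 3306) into the DB security group, only from the web tier
ec2.authorize_security_group_ingress(
    GroupId="sg-db0123456789abcde",
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 3306,
        "ToPort": 3306,
        "UserIdGroupPairs": [{"GroupId": "sg-web0123456789abcd"}],
    }],
)
```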

Level 5: Cloud Economics

This mission focused on understanding AWS pricing tools and how to estimate costs for variable workloads in Amazon EC2.

Challenge:
Configure a price estimate for an architecture that uses a variable number of EC2 instances during peak usage periods.

Hands-On Work:

  • Created logical pricing groups to organize and categorize resources for cost estimation.

  • Generated an initial price estimate for Amazon EC2 usage based on scaling patterns.

  • Modified the EC2 instance type to t2.micro to reflect the updated requirement.

  • Generated a new price estimate URL using the AWS Pricing Calculator via Skill Builder.

Key Learning:
I learned how to use the AWS Pricing Calculator to estimate costs for scalable workloads, how logical pricing groups help in organizing resources, and how adjusting instance types impacts overall pricing.

Outcome:
Successfully created and updated a cost estimate for EC2 usage with the t2.micro instance type, and generated a new shareable price estimate URL.
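
Under the hood, an estimate like this is simple arithmetic. Here's an illustrative back-of-the-envelope version in Python; the hourly rate and usage numbers are assumptions for the example, not official AWS prices:

```python
# Rough monthly estimate for a variable EC2 workload (illustrative numbers only)
hourly_rate = 0.0116          # assumed on-demand rate for t2.micro (USD/hour)
baseline_instances = 2        # running around the clock
peak_instances = 6            # total capacity during peak hours
peak_hours_per_day = 4
hours_per_month = 730

baseline_cost = baseline_instances * hours_per_month * hourly_rate
peak_cost = (peak_instances - baseline_instances) * peak_hours_per_day * 30 * hourly_rate

print(f"Estimated monthly EC2 cost: ${baseline_cost + peak_cost:.2f}")
```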

Level 6: Databases in Practice

This mission introduced me to Amazon RDS (Relational Database Service) and how to enhance database efficiency, availability, and performance using Multi-AZ deployments and read replicas.

Challenge:
Optimize a database setup for high availability and faster read performance by leveraging AWS database features.

Hands-On Work:

  • Explored AWS database offerings and their use cases

  • Launched an Amazon RDS instance with the required configuration

  • Configured a Multi-AZ deployment to ensure high availability

  • Enabled and configured automatic RDS backups for disaster recovery

  • Created a read replica of the primary database using a db.t3.xlarge instance to improve read scalability

Key Learning:
I learned how AWS RDS provides managed database solutions that improve availability through Multi-AZ setups, and how read replicas offload read queries to enhance performance without affecting the primary database.

Outcome:
Successfully deployed a highly available RDS instance with a read replica, ensuring improved reliability, backup protection, and optimized performance for read-heavy workloads.
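
As a rough boto3 sketch, the same deployment looks like this. Identifiers and credentials are placeholders, and a real setup would pull the password from a secrets store:

```python
import boto3

rds = boto3.client("rds")

# Primary database with Multi-AZ failover and automated backups
rds.create_db_instance(
    DBInstanceIdentifier="quest-db",          # placeholder names/credentials
    Engine="mysql",
    DBInstanceClass="db.t3.xlarge",
    AllocatedStorage=20,
    MasterUsername="admin",
    MasterUserPassword="change-me-please",
    MultiAZ=True,
    BackupRetentionPeriod=7,                  # days of automated backups
)

# Read replica to offload read-heavy traffic from the primary
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="quest-db-replica",
    SourceDBInstanceIdentifier="quest-db",
    DBInstanceClass="db.t3.xlarge",
)
```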

Level 7: Connecting VPCs

This mission introduced me to VPC peering in AWS and how to enable secure communication between applications hosted in different VPCs.

Challenge:
Enable applications in the Marketing and Developer VPCs to communicate with the Financial Services server in the Finance department's VPC by setting up VPC peering and proper routing.

Hands-On Work:

  • Explored the concept of VPC peering and its role in connecting isolated networks in AWS

  • Created a VPC peering connection between the Developer and Finance department VPCs

  • Configured route tables in both VPCs to ensure traffic can flow between them

  • Verified connectivity between the Marketing/Developer EC2 instances and the Financial Services server

Key Learning:
I learned how VPC peering enables direct private communication between VPCs without requiring public internet access. Proper route table updates are crucial to ensure successful packet delivery between peered VPCs.

Outcome:
Successfully configured a VPC peering connection between the Developer and Finance department VPCs, enabling secure and reliable communication between their resources while maintaining network isolation from external traffic.
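
A boto3 sketch of the peering setup is below. The VPC IDs, route table IDs, and CIDR blocks are placeholders; the key point is that both sides need a route through the peering connection:

```python
import boto3

ec2 = boto3.client("ec2")

# Request a peering connection from the Developer VPC to the Finance VPC
peering = ec2.create_vpc_peering_connection(
    VpcId="vpc-dev0123456789abc",        # placeholder VPC IDs
    PeerVpcId="vpc-fin0123456789abc",
)
pcx_id = peering["VpcPeeringConnection"]["VpcPeeringConnectionId"]

# Accept the request (same account in this scenario)
ec2.accept_vpc_peering_connection(VpcPeeringConnectionId=pcx_id)

# Each VPC needs a route to the other's CIDR block via the peering connection
ec2.create_route(
    RouteTableId="rtb-dev0123456789abc",
    DestinationCidrBlock="10.1.0.0/16",  # assumed Finance VPC CIDR
    VpcPeeringConnectionId=pcx_id,
)
ec2.create_route(
    RouteTableId="rtb-fin0123456789abc",
    DestinationCidrBlock="10.0.0.0/16",  # assumed Developer VPC CIDR
    VpcPeeringConnectionId=pcx_id,
)
```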

Level 8: First NoSQL Database

This mission introduced me to Amazon DynamoDB and how to create and manage a NoSQL database in AWS.

Challenge:
Track customer viewing data, such as movies watched and device type, by creating a DynamoDB table, adding dynamic attributes, and querying the data.

Hands-On Work:

  • Created a DynamoDB table with a unique userId as the partition key.

  • Added records with dynamic attributes, including a new rating field.

  • Queried the table to retrieve and verify stored data.

Key Learning:
Learned how DynamoDB supports flexible schemas and efficient data retrieval without fixed table structures.

Outcome:
Successfully created and queried a NoSQL DynamoDB table to store and manage customer viewing metadata.
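
Here's a small boto3 sketch of the same idea; the table name and attribute values are placeholders, and notice that only the partition key has to be declared up front:

```python
import boto3

dynamodb = boto3.resource("dynamodb")

# Table keyed on userId; no other schema needs to be declared in advance
table = dynamodb.create_table(
    TableName="CustomerViewing",               # placeholder table name
    KeySchema=[{"AttributeName": "userId", "KeyType": "HASH"}],
    AttributeDefinitions=[{"AttributeName": "userId", "AttributeType": "S"}],
    BillingMode="PAY_PER_REQUEST",
)
table.wait_until_exists()

# Items can carry whatever attributes they need, e.g. a later-added rating field
table.put_item(Item={
    "userId": "u-1001",
    "lastMovieWatched": "Inception",
    "deviceType": "tablet",
    "rating": 5,
})

print(table.get_item(Key={"userId": "u-1001"})["Item"])
```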

Level 9: File Systems in the Cloud

This mission focused on deploying and managing a shared file system in AWS using Amazon EFS.

Challenge:
Set up an EFS file system accessible from multiple EC2 instances to enable file sharing.

Hands-On Work:

  • Launched and configured an Amazon EFS file system.

  • Mounted the EFS on two EC2 instances and shared files between them.

  • Mounted the same EFS on a third EC2 instance to test accessibility.

Key Learning:
Understood how EFS provides scalable, shared storage for multiple instances with seamless file sharing.

Outcome:
Successfully deployed a shared EFS and verified file access across three EC2 instances.
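
Provisioning the file system can be scripted too. Below is a rough boto3 sketch (subnet and security group IDs are placeholders); the actual mounting then happens on each instance with an NFS mount command:

```python
import boto3

efs = boto3.client("efs")

# Create the shared file system
fs = efs.create_file_system(
    PerformanceMode="generalPurpose",
    Tags=[{"Key": "Name", "Value": "quest-shared-fs"}],  # placeholder name
)
fs_id = fs["FileSystemId"]

# One mount target per subnet/AZ lets instances in that AZ attach the file system
for subnet_id in ["subnet-aaa", "subnet-bbb"]:           # placeholder subnet IDs
    efs.create_mount_target(
        FileSystemId=fs_id,
        SubnetId=subnet_id,
        SecurityGroups=["sg-efs0123456789abc"],          # must allow NFS (port 2049)
    )

# On each EC2 instance, the file system is then mounted with something like:
#   sudo mount -t nfs4 <fs_id>.efs.<region>.amazonaws.com:/ /mnt/efs
```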

Level 10: Core Security Concepts

This mission focused on managing permissions in AWS using IAM and the principle of least privilege.

Challenge:
Provide engineers with the required permissions using group settings while restricting unnecessary access.

Hands-On Work:

  • Created an IAM group and added users.

  • Attached an AWS managed policy to the group.

  • Granted the Support group read-only access to an Amazon RDS instance.

Key Learning:
Learned how IAM groups simplify permission management and how the least privilege principle enhances security.

Outcome:
Successfully configured IAM groups and policies, ensuring secure, controlled access to AWS resources.
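
In boto3, the same group-based setup looks roughly like this; the group and user names are placeholders, while AmazonRDSReadOnlyAccess is a real AWS managed policy:

```python
import boto3

iam = boto3.client("iam")

# Group-based permissions: manage the policy once, not per user
iam.create_group(GroupName="Support")
iam.add_user_to_group(GroupName="Support", UserName="support-engineer-1")  # placeholder user

# AWS managed policy granting read-only access to RDS
iam.attach_group_policy(
    GroupName="Support",
    PolicyArn="arn:aws:iam::aws:policy/AmazonRDSReadOnlyAccess",
)
```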

Level 11: Auto-Healing and Scaling Applications

This mission focused on configuring AWS Auto Scaling to handle workload changes automatically.

Challenge:
Create and configure an Amazon EC2 Auto Scaling group with scheduled scaling to add and remove instances.

Hands-On Work:

  • Created an Auto Scaling group and launched EC2 instances into it.

  • Configured a scheduled scaling policy to add resources daily at 10:00 AM.

Key Learning:
Understood how Auto Scaling maintains performance and cost efficiency by adjusting capacity automatically.

Outcome:
Successfully implemented an Auto Scaling group with scheduled actions for predictable workload changes.
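
A minimal boto3 sketch of the scheduled scaling setup is below; the group name, launch template, and subnets are placeholders, and the cron expression is evaluated in UTC:

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Auto Scaling group spanning two subnets (assumes the launch template exists)
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="quest-asg",
    LaunchTemplate={"LaunchTemplateName": "quest-template", "Version": "$Latest"},
    MinSize=1,
    MaxSize=4,
    DesiredCapacity=1,
    VPCZoneIdentifier="subnet-aaa,subnet-bbb",
)

# Scheduled action: scale out every day at 10:00 AM
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName="quest-asg",
    ScheduledActionName="daily-scale-out",
    Recurrence="0 10 * * *",
    DesiredCapacity=3,
)
```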

Level 12: Highly Available Web Applications

This mission was about ensuring uptime and reliability through a multi-AZ architecture.

Challenge:
Build a highly available setup using an Application Load Balancer (ALB) with health checks and multi-AZ Auto Scaling.

Hands-On Work:

  • Configured an Auto Scaling group to integrate with an ALB.

  • Set up load balancer health checks for automated instance recovery.

  • Added a second Availability Zone for better fault tolerance.

Key Learning:
Learned how distributing resources across multiple AZs improves availability and resilience.

Outcome:
Successfully deployed a fault-tolerant, load-balanced architecture spanning multiple AZs.
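
Finally, here's a rough boto3 sketch of wiring an ALB to an Auto Scaling group; names and IDs are placeholders, and a listener forwarding port 80 to the target group would complete the setup:

```python
import boto3

elbv2 = boto3.client("elbv2")
autoscaling = boto3.client("autoscaling")

# Application Load Balancer across two AZs (placeholder subnet/VPC IDs)
alb = elbv2.create_load_balancer(
    Name="quest-alb",
    Subnets=["subnet-aaa", "subnet-bbb"],
    Type="application",
)

# Target group with a health check so unhealthy instances are replaced
tg = elbv2.create_target_group(
    Name="quest-targets",
    Protocol="HTTP",
    Port=80,
    VpcId="vpc-0123456789abcdef0",
    HealthCheckPath="/",
)
tg_arn = tg["TargetGroups"][0]["TargetGroupArn"]

# The Auto Scaling group registers its instances with the target group
autoscaling.attach_load_balancer_target_groups(
    AutoScalingGroupName="quest-asg",
    TargetGroupARNs=[tg_arn],
)
```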

🎯 The Gamified Experience — Learning While Playing

AWS Cloud Quest transforms learning into an interactive adventure:

  • You earn coins and upgrade your virtual city as you complete missions.

  • The tournament mode adds a competitive edge.

  • Solving real-world AWS problems keeps you motivated and job-ready.

This approach works because:

  • It breaks the monotony of traditional video lectures

  • You get instant feedback on labs

  • The sense of progress is visual and rewarding

📚 Skills I Gained

By the end of the course, I had:

  • Practical experience with core AWS services (EC2, S3, RDS, DynamoDB, VPC, IAM)

  • Knowledge of cloud architecture best practices

  • Confidence in cost optimization and security

  • Preparedness for the AWS Certified Cloud Practitioner exam

My Cloud Quest: Cloud Practitioner Badge

📝 Final Thoughts

AWS Cloud Quest has been one of the most fun and effective ways to learn cloud computing.

It’s not just about getting a badge — it’s about building skills that directly apply to real-world projects.

If you’re a student or beginner, I highly recommend starting your AWS journey here. Who knows? This could be the first step to your dream cloud career.

