Building a Cloud-Native Code Compiler with Django and AWS ECS: A Learning Journey
Introduction
Hey fellow developers! Today I'm excited to share my journey building an online code compilation system that combines Django with AWS ECS. This project started as an exploration into cloud-native applications and evolved into a practical learning experience with containerization, cloud deployment, and infrastructure as code.
Project Overview
The system allows users to write and execute code in their browsers, supporting multiple programming languages (Python, Java, C, C++, and JavaScript). While there are many sophisticated online compilers available, building one from scratch provided valuable insights into cloud infrastructure, containerization, and secure code execution.
Key Features
Multi-language support (Python, Java, C, C++, JavaScript)
Browser-based code execution
Input/output handling
Theme customization (Dracula theme default)
Containerized deployment on AWS ECS
Infrastructure as Code using Terraform
Automated deployment via GitHub Actions
Technical Implementation
Frontend Components
The frontend is intentionally kept minimal and functional, using a simple form-based approach:
<form method="post" class="h-full flex flex-col">
  {% csrf_token %}
  <div class="flex space-x-4 mb-4">
    <select name="selected-language" class="...">
      <option value="python">Python</option>
      <option value="java">Java</option>
      <option value="c">C</option>
      <option value="cpp">C++</option>
      <option value="javascript">JavaScript</option>
    </select>
    <!-- Theme selector and other controls -->
  </div>
  <div class="flex flex-1">
    <!-- Code editor and output sections -->
  </div>
</form>
Backend Implementation
The core of the application is a Django view that handles code execution. Here's the complete implementation:
import os
import subprocess
import uuid

from django.shortcuts import render


def compiler(request):
    code = request.POST.get('code', '')
    input_data = request.POST.get('input', '')
    output = ""
    selected_theme = request.POST.get('selected-theme', 'dracula')
    selected_language = request.POST.get('selected-language', 'python')

    if request.method == 'POST':
        try:
            # Create a unique filename for this compilation
            filename = f"code_{uuid.uuid4()}"

            if selected_language == 'python':
                with open(f"{filename}.py", "w") as f:
                    f.write(code)
                process = subprocess.Popen(
                    ["python", f"{filename}.py"],
                    stdin=subprocess.PIPE,
                    stdout=subprocess.PIPE,
                    stderr=subprocess.PIPE
                )
                stdout, stderr = process.communicate(input=input_data.encode(), timeout=5)
                os.remove(f"{filename}.py")
                output = stdout.decode() + stderr.decode()

            elif selected_language == 'javascript':
                with open(f"{filename}.js", "w") as f:
                    f.write(code)
                process = subprocess.Popen(
                    ["node", f"{filename}.js"],
                    stdin=subprocess.PIPE,
                    stdout=subprocess.PIPE,
                    stderr=subprocess.PIPE
                )
                stdout, stderr = process.communicate(input=input_data.encode(), timeout=5)
                os.remove(f"{filename}.js")
                output = stdout.decode() + stderr.decode()

            elif selected_language == 'java':
                # Java requires the public class name to match the file name,
                # so the user's code is wrapped in a generated class
                class_name = f"Code_{uuid.uuid4().hex}"
                with open(f"{class_name}.java", "w") as f:
                    f.write(f"public class {class_name} {{\n")
                    f.write("    public static void main(String[] args) {\n")
                    f.write(code)
                    f.write("\n    }\n}")
                compile_process = subprocess.Popen(
                    ["javac", f"{class_name}.java"],
                    stdout=subprocess.PIPE,
                    stderr=subprocess.PIPE
                )
                compile_stdout, compile_stderr = compile_process.communicate(timeout=5)
                if compile_process.returncode == 0:
                    run_process = subprocess.Popen(
                        ["java", class_name],
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE
                    )
                    stdout, stderr = run_process.communicate(input=input_data.encode(), timeout=5)
                    output = stdout.decode() + stderr.decode()
                else:
                    output = compile_stderr.decode()
                os.remove(f"{class_name}.java")
                if os.path.exists(f"{class_name}.class"):
                    os.remove(f"{class_name}.class")

            elif selected_language in ['c', 'cpp']:
                extension = 'c' if selected_language == 'c' else 'cpp'
                # renamed from `compiler` to avoid shadowing this view function
                compiler_cmd = 'gcc' if selected_language == 'c' else 'g++'
                with open(f"{filename}.{extension}", "w") as f:
                    f.write(code)
                compile_process = subprocess.Popen(
                    [compiler_cmd, f"{filename}.{extension}", "-o", filename],
                    stdout=subprocess.PIPE,
                    stderr=subprocess.PIPE
                )
                compile_stdout, compile_stderr = compile_process.communicate(timeout=5)
                if compile_process.returncode == 0:
                    run_process = subprocess.Popen(
                        [f"./{filename}"],
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE
                    )
                    stdout, stderr = run_process.communicate(input=input_data.encode(), timeout=5)
                    output = stdout.decode() + stderr.decode()
                else:
                    output = compile_stderr.decode()
                os.remove(f"{filename}.{extension}")
                if os.path.exists(filename):
                    os.remove(filename)

        except subprocess.TimeoutExpired:
            output = "The program exceeded the time limit of 5 seconds."
        except Exception as e:
            output = f"An error occurred: {str(e)}"

    return render(request, 'compiler/compiler.html', {
        'code': code,
        'input_data': input_data,
        'output': output,
        'selected_theme': selected_theme,
        'selected_language': selected_language
    })
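One weakness of this view is that it writes source files into the working directory and deletes them by hand, so a timeout or crash can leak files. A safer pattern (a sketch under my own assumptions, not the project's actual code; `run_python_snippet` is an illustrative name) is to run each submission inside a `tempfile.TemporaryDirectory`, which is removed automatically even when the subprocess fails:

```python
import subprocess
import sys
import tempfile
from pathlib import Path


def run_python_snippet(code: str, input_data: str = "", timeout: int = 5) -> str:
    """Execute a Python snippet in a throwaway directory and return its combined output."""
    # TemporaryDirectory deletes the directory and everything in it on exit,
    # even if the subprocess times out or an exception is raised.
    with tempfile.TemporaryDirectory() as workdir:
        source = Path(workdir) / "main.py"
        source.write_text(code)
        try:
            result = subprocess.run(
                [sys.executable, str(source)],
                input=input_data,
                capture_output=True,
                text=True,
                timeout=timeout,
                cwd=workdir,  # any files the snippet writes stay inside the sandbox dir
            )
        except subprocess.TimeoutExpired:
            return f"The program exceeded the time limit of {timeout} seconds."
        return result.stdout + result.stderr
```

For example, `run_python_snippet("print(input())", "hi")` returns `"hi\n"`, and no stray files are left behind regardless of how the snippet exits.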
Infrastructure Setup
The project uses AWS ECS Fargate for deployment, with infrastructure defined in Terraform:
# ECS Cluster
resource "aws_ecs_cluster" "compiler_cluster" {
  name = "online-compiler-cluster"
}

# Task Definition
resource "aws_ecs_task_definition" "compiler_task" {
  family                   = "compiler-task"
  requires_compatibilities = ["FARGATE"]
  network_mode             = "awsvpc"
  cpu                      = 256
  memory                   = 512

  container_definitions = jsonencode([
    {
      name  = "compiler-container"
      image = var.container_image
      # Container configuration
    }
  ])
}

# VPC Configuration
resource "aws_vpc" "compiler_vpc" {
  cidr_block = "10.0.0.0/16"

  tags = {
    Name = "online-compiler-vpc"
  }
}

# Security Group
resource "aws_security_group" "compiler_sg" {
  name        = "compiler-security-group"
  description = "Security group for online compiler"
  vpc_id      = aws_vpc.compiler_vpc.id
  # Security rules
}
CI/CD Pipeline
The deployment process is automated using GitHub Actions:
name: Deploy Infrastructure

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]
  workflow_dispatch:

jobs:
  terraform:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ env.AWS_REGION }}

      - name: Terraform Init
        run: terraform init

      - name: Terraform Apply
        run: terraform apply -auto-approve
Deployment Guide
Prerequisites
AWS Account
GitHub Account
Terraform installed locally (for manual deployment)
AWS CLI configured
GitHub Actions Deployment
Fork the repository
Add AWS credentials as repository secrets:
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
Push to main branch or create a PR
⚠️ Important: Run the destroy.yml workflow after testing to avoid ongoing AWS charges
Manual Deployment
Clone the repository
Navigate to terraform directory
Run:
terraform init
terraform plan
terraform apply
Retrieve the running ECS task's public IP manually (it changes on each deployment)
After testing:
terraform destroy -auto-approve
Current Limitations
Technical Constraints
Security
Basic file system operations
Direct process execution
Limited input validation
Infrastructure
No auto-scaling
Dynamic IP allocation
Single container for all languages
No load balancing
Features
No code persistence
Basic error handling
Limited resource management
No user sessions
Future Improvements
High Priority
Security Enhancements
Implement proper sandboxing
Add comprehensive input validation
Use temporary directories
Add rate limiting
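The rate-limiting item above could start as a small in-memory token bucket. This is only a sketch under my own assumptions (the `TokenBucket` name is illustrative, and a real deployment with multiple workers would need shared state such as Redis):

```python
import time


class TokenBucket:
    """Minimal token-bucket rate limiter: allow bursts up to `capacity`
    requests, refilled at `rate` tokens per second. Per-process and
    in-memory only -- a sketch, not production-ready."""

    def __init__(self, capacity: int, rate: float):
        self.capacity = capacity
        self.rate = rate
        self.tokens = float(capacity)
        self.updated = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to the time elapsed, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

A view could then reject submissions with an error message (or HTTP 429) whenever `allow()` returns False for the requesting client.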
Infrastructure
Add Application Load Balancer
Configure static IP
Implement auto-scaling
Set up proper monitoring
Architecture
Convert to REST API
Add request queuing
Implement proper containerization
Add user authentication
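One concrete step toward the architecture goals above would be collapsing the view's long if/elif chain into a per-language configuration table, which also makes adding languages trivial. This is a hypothetical refactor sketch (Java is left out because it additionally wraps the code in a generated class):

```python
# Per-language configuration: source suffix, optional compile command,
# and run command. {src} and {binary} are filled in per submission.
LANGUAGES = {
    "python": {"suffix": ".py", "compile": None, "run": ["python", "{src}"]},
    "javascript": {"suffix": ".js", "compile": None, "run": ["node", "{src}"]},
    "c": {"suffix": ".c", "compile": ["gcc", "{src}", "-o", "{binary}"], "run": ["./{binary}"]},
    "cpp": {"suffix": ".cpp", "compile": ["g++", "{src}", "-o", "{binary}"], "run": ["./{binary}"]},
}


def build_commands(language: str, src: str, binary: str):
    """Expand the command templates for one submission.

    Returns (compile_cmd, run_cmd); compile_cmd is None for interpreted languages.
    """
    cfg = LANGUAGES[language]

    def fill(cmd):
        return [part.format(src=src, binary=binary) for part in cmd]

    compile_cmd = fill(cfg["compile"]) if cfg["compile"] else None
    return compile_cmd, fill(cfg["run"])
```

The execution path then becomes a single generic function: write the source, run `compile_cmd` if present, then run `run_cmd`, instead of four near-identical branches.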
Learning Outcomes
Building this project provided valuable experience in:
Cloud Infrastructure
AWS ECS deployment
Container orchestration
VPC networking
IAM role management
DevOps Practices
Infrastructure as Code
CI/CD pipelines
Container management
Resource optimization
Backend Development
Django application structure
Process management
Error handling
File system operations
Important Warnings ⚠️
AWS Resources
Always run destroy workflow after testing
Verify resource cleanup in AWS console
Monitor AWS billing
Check action workflow logs
Security Considerations
Current implementation is for learning
Not production-ready without enhancements
Basic security implementations
Limited resource restrictions
Get Involved
I'm actively working on improvements and would love to collaborate! You can:
Submit pull requests
Suggest improvements
Share your experiences
Discuss cloud infrastructure
Conclusion
This project serves as a practical learning exercise in cloud-native development. While it has limitations, it provides valuable insights into container orchestration, cloud infrastructure, and automated deployments.
Remember to star the repository if you found it useful, and feel free to reach out for discussions or contributions!
#AWS #Django #Docker #CloudComputing #WebDevelopment #Programming #DevOps #OpenSource #Learning
Kanav Gathe
https://github.com/SlayerK15/Online-Compiler
https://www.linkedin.com/in/gathekanav/
31-10-2024