Three-Tier Architecture: Containerizing Frontend, Backend, and Database in Perfect Harmony

In the modern digital landscape, the applications we interact with daily—from e-commerce platforms to social media sites—are typically built using a three-tier architecture. This architectural pattern organizes applications into distinct layers, making them more maintainable, scalable, and efficient.
This guide walks you through deploying a three-tier web application using Docker Compose, creating a seamless DevOps pipeline across Development, Quality Assurance (QA), and Production environments.
Understanding Three-Tier Web Application Architecture
A three-tier architecture separates an application into three interconnected layers:
1. Presentation Layer (Frontend)
This is what users see and interact with—the visual interface including buttons, forms, images, and navigation elements.
Technologies commonly used:
HTML (structure)
CSS (styling)
JavaScript (interactivity)
2. Logic Layer (Backend)
The "brain" of the application that processes requests, executes business logic, and communicates between the presentation and data layers.
Technologies commonly used:
Programming languages: Python, Java, Node.js, Ruby
Frameworks: Django, Spring, Express
APIs for communication between layers
3. Data Layer (Database)
The storage system that maintains all persistent information, from user profiles to transaction records.
Technologies commonly used:
SQL databases: MySQL, PostgreSQL
NoSQL databases: MongoDB
Cloud databases: AWS RDS, Google Cloud SQL
How These Layers Interact
Imagine searching for "sneakers" on an online store:
You enter your search in the frontend interface
The backend processes this request and queries the database
The database returns matching products to the backend
The backend processes this information and sends it to the frontend
The frontend displays the results on your screen
This separation of concerns makes applications more organized, easier to maintain, and more scalable.
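To make the flow concrete, here is a minimal sketch of the logic layer handling that search, assuming a Node.js/Express backend and a MongoDB data layer (the route, database, collection, and port names are illustrative, not from the project):

```javascript
// Hypothetical search endpoint in the logic layer (names are illustrative)
const express = require('express');
const { MongoClient } = require('mongodb');

const app = express();
const client = new MongoClient('mongodb://mongo:27017'); // "mongo" = database container/host name (assumed)

app.get('/api/search', async (req, res) => {
  const term = req.query.q || '';                  // 1. frontend sends the search term
  const products = await client.db('store')
    .collection('products')                        // 2. backend queries the data layer
    .find({ name: { $regex: term, $options: 'i' } })
    .limit(20)
    .toArray();
  res.json(products);                              // 3. backend returns results for the frontend to render
});

client.connect().then(() => app.listen(5000));     // backend port is an assumption
```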
Deployment Strategy Overview
Our deployment approach uses Docker Compose to containerize and manage the three layers of our application. We'll implement a comprehensive DevOps pipeline with three distinct phases:
Local Testing - Ensuring everything works correctly in the development environment
CI/CD Pipeline Setup - Implementing automated testing and deployment
Multi-Environment Pipeline - Establishing separate pipelines for Development, QA, and Production
Phase 1: Testing Locally
Step 1: Set Up a Base EC2 Instance
Launch an Ubuntu EC2 instance in AWS
Create a new key pair for secure access
Configure security groups to open necessary ports (HTTP, HTTPS, and application-specific ports)
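If you prefer the AWS CLI, the inbound rules can be added as sketched below; the security group ID is a placeholder, and the application ports (5173 for the frontend, 8080 for Jenkins, 9000 for SonarQube) come from later steps in this guide:

```bash
SG_ID=sg-0123456789abcdef0   # placeholder: your security group ID

# SSH, HTTP, HTTPS, and the application-specific ports (Vite frontend, Jenkins, SonarQube)
for port in 22 80 443 5173 8080 9000; do
  aws ec2 authorize-security-group-ingress \
    --group-id "$SG_ID" --protocol tcp --port "$port" --cidr 0.0.0.0/0
done
```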
Step 2: Configure IAM Role for EC2
An IAM role grants your EC2 instance permissions to interact with other AWS services.
Create a new IAM role with the AdministratorAccess policy and attach it to your EC2 instance:
In the AWS console search bar, type IAM and open the IAM service
Click Roles in the left sidebar, then click Create role
Choose EC2 as the trusted entity and click Next
In the permissions section, select the AdministratorAccess policy, click Next, give the role a name, and click Create role
Back in the EC2 console, select your instance and choose Actions → Security → Modify IAM role to attach the new role
The same setup can also be scripted with the AWS CLI, as sketched below.
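A hedged AWS CLI equivalent; the role name, instance profile name, and instance ID are placeholders:

```bash
# Trust policy allowing EC2 to assume the role
cat > trust.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "Service": "ec2.amazonaws.com" },
    "Action": "sts:AssumeRole"
  }]
}
EOF

aws iam create-role --role-name demo-ec2-role --assume-role-policy-document file://trust.json
aws iam attach-role-policy --role-name demo-ec2-role \
  --policy-arn arn:aws:iam::aws:policy/AdministratorAccess

# EC2 attaches roles through an instance profile
aws iam create-instance-profile --instance-profile-name demo-ec2-profile
aws iam add-role-to-instance-profile --instance-profile-name demo-ec2-profile --role-name demo-ec2-role
aws ec2 associate-iam-instance-profile --instance-id i-0123456789abcdef0 \
  --iam-instance-profile Name=demo-ec2-profile
```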
Step 3: Clone the Repository
Connect to your EC2 instance using your preferred tools and clone the application repository:
sudo su
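After switching to root, clone the repository and move into it; the URL and folder name below are placeholders for your own fork or the project repository:

```bash
git clone <your-application-repo-url>   # placeholder: the project's Git URL
cd <repo-directory>                     # placeholder: the cloned folder name
ls                                      # confirm the frontend, backend, and Compose files are present
```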
Step 4: Install the Necessary Tools and Verify the Installation
bash script.sh
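Once the script finishes, a quick version check confirms the tooling is in place. The exact tools depend on what your script installs; the checks below assume Docker, Docker Compose, Trivy, and Jenkins:

```bash
docker --version
docker-compose --version
trivy --version                        # if the script installs Trivy
systemctl status jenkins --no-pager    # if the script installs Jenkins as a service
```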
Step 5: Deploy Locally Using Docker Compose
Launch all containers with a single command:
docker-compose up -d
This will create and connect three containers: frontend, backend, and database.
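The actual Compose file lives in the repository; a minimal sketch of what it typically looks like for this stack is shown below. The service names, build paths, backend port, and environment variable name are assumptions; the wanderlust database name comes from the seeding step later in this phase:

```yaml
# docker-compose.yml (illustrative sketch, not the project's actual file)
version: "3.8"
services:
  frontend:
    build: ./frontend          # Vite/React app served on 5173
    ports:
      - "5173:5173"
    depends_on:
      - backend
  backend:
    build: ./backend           # Node.js API
    ports:
      - "5000:5000"
    environment:
      - MONGODB_URI=mongodb://mongo:27017/wanderlust
    depends_on:
      - mongo
  mongo:
    image: mongo:latest
    ports:
      - "27017:27017"
    volumes:
      - mongo-data:/data/db    # persist the data layer
volumes:
  mongo-data:
```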
Access your application at http://[public-ip]:5173
You'll notice that the featured posts section is empty because the database has no data yet. To fix this, first list the running containers and note the MongoDB container name:
docker ps
To populate the database with sample data:
docker exec -it [mongo-container-name] mongoimport --db wanderlust --collection posts --file ./data/sample_posts.json --jsonArray
Phase 2: Setting Up the CI/CD Pipeline
Step 1: Set Up SonarQube
Access SonarQube at http://[public-ip]:9000
Log in with default credentials (admin/admin)
Set a new password when prompted
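If SonarQube isn't already running on the instance (the setup script may or may not start it), it can be launched as a container; this is a standard invocation rather than anything specific to this project:

```bash
docker run -d --name sonarqube -p 9000:9000 sonarqube:lts-community
```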
Step 2: Set Up Jenkins
Access Jenkins at http://[public-ip]:8080
Retrieve the initial admin password:
sudo cat /var/lib/jenkins/secrets/initialAdminPassword
Install the suggested plugins
Create your Jenkins admin user
Step 3: Install Required Jenkins Plugins
Eclipse Temurin Installer
SonarQube Scanner
Docker Compose Build Step
NodeJS Plugin
OWASP Dependency-Check
Prometheus metrics
Docker-related plugins
Step 4: Set Up Credentials
SonarQube Credentials:
Generate a token in SonarQube (Security → Users → Tokens)
Add this as a Secret Text credential in Jenkins with ID "sonar-token"
Docker Hub Credentials:
Add your Docker Hub username and password as credentials in Jenkins with ID "docker-cred"
Step 5: Configure Jenkins Tools
- Set up JDK 17
- Configure Node.js 16.2.0
- Set up Docker
- Configure SonarQube Scanner
- Set up OWASP Dependency-Check
Step 6: Create and Run the Pipeline
Create a new pipeline job in Jenkins with a pipeline script that covers the stages described below (a hedged sketch of such a script follows the list).
This pipeline will:
Check out code from GitHub
Install dependencies
Run SonarQube analysis for code quality
Perform security scanning with OWASP and Trivy
Build Docker images
Push images to Docker Hub
Deploy the application
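The full script is specific to the project; the sketch below shows one way to implement these stages. The tool names (jdk17, node16, sonar-scanner), the SonarQube server name (sonar-server), the project key, the repository URL, and the frontend/backend folder layout are assumptions to adjust to your setup. The docker-cred credential ID comes from Step 4; the sonar-token credential is referenced in the SonarQube server configuration rather than in the script itself.

```groovy
pipeline {
    agent any
    tools {
        jdk 'jdk17'        // tool names assumed to match Step 5
        nodejs 'node16'
    }
    environment {
        SCANNER_HOME = tool 'sonar-scanner'
    }
    stages {
        stage('Checkout') {
            steps {
                git branch: 'main', url: '<your-application-repo-url>'   // placeholder URL
            }
        }
        stage('Install Dependencies') {
            steps {
                sh 'cd frontend && npm install'    // assumed folder layout
                sh 'cd backend && npm install'
            }
        }
        stage('SonarQube Analysis') {
            steps {
                withSonarQubeEnv('sonar-server') {   // server name assumed
                    sh "${SCANNER_HOME}/bin/sonar-scanner -Dsonar.projectKey=wanderlust -Dsonar.projectName=wanderlust"
                }
            }
        }
        stage('OWASP Dependency-Check') {
            steps {
                dependencyCheck additionalArguments: '--scan ./', odcInstallation: 'dp-check'   // installation name assumed
                dependencyCheckPublisher pattern: '**/dependency-check-report.xml'
            }
        }
        stage('Trivy Filesystem Scan') {
            steps {
                sh 'trivy fs . > trivy-fs-report.txt'
            }
        }
        stage('Build and Push Images') {
            steps {
                withDockerRegistry(credentialsId: 'docker-cred', url: 'https://index.docker.io/v1/') {
                    sh 'docker-compose build'
                    sh 'docker-compose push'   // requires image: entries in the Compose file pointing at your Docker Hub repos
                }
            }
        }
        stage('Deploy') {
            steps {
                sh 'docker-compose up -d'
            }
        }
    }
}
```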
Phase 3: Multi-Environment Pipeline Setup
Why Multiple Environments Matter
The path from development to production requires multiple environments to ensure quality at each stage:
1. Development Environment
Purpose: Safe testing ground for new features and bug fixes
Users: Developers
Goal: Catch issues early in the development process
2. Quality Assurance Environment
Purpose: Thorough testing in controlled conditions
Users: QA teams
Goal: Identify bugs, performance issues, and ensure stability
3. Production Environment
Purpose: Live environment for real users
Users: Customers/end-users
Goal: Provide stable, reliable service
Development Pipeline
The Development pipeline builds from the development branch and deploys to the Development environment, so developers can verify new features and bug fixes early.
QA Pipeline
The QA pipeline promotes the build into the QA environment, where the QA team runs thorough functional and stability testing.
Production Pipeline
The Production pipeline deploys the approved release to the live Production environment for end users. A hedged sketch of how these per-environment pipelines can be parameterized is shown below.
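The exact scripts depend on the project, but one common pattern is a single parameterized pipeline where only the branch, Compose file, and approval gate change per environment. This is a sketch under those assumptions; the branch names, Compose file names, and repository URL are placeholders:

```groovy
pipeline {
    agent any
    parameters {
        // Each environment's job sets its own default for this parameter
        choice(name: 'DEPLOY_ENV', choices: ['dev', 'qa', 'prod'], description: 'Target environment')
    }
    stages {
        stage('Checkout') {
            steps {
                script {
                    // Assumed branch naming; adjust to your Git workflow
                    def branches = [dev: 'develop', qa: 'release', prod: 'main']
                    git branch: branches[params.DEPLOY_ENV], url: '<your-application-repo-url>' // placeholder URL
                }
            }
        }
        stage('Build') {
            steps {
                sh 'docker-compose build'
            }
        }
        stage('Approval') {
            // Only Production deployments wait for a manual approval
            when { expression { params.DEPLOY_ENV == 'prod' } }
            steps {
                input message: 'Deploy to Production?'
            }
        }
        stage('Deploy') {
            steps {
                // Assumes one Compose override file per environment, e.g. docker-compose.dev.yml
                sh "docker-compose -f docker-compose.yml -f docker-compose.${params.DEPLOY_ENV}.yml up -d"
            }
        }
    }
}
```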
Conclusion
This comprehensive approach to deploying a three-tier web application using Docker Compose and implementing a full DevOps pipeline ensures:
Reliability: Through systematic testing across multiple environments
Quality: Via automated code quality and security checks
Efficiency: With containerization and automation
Scalability: By separating concerns into distinct layers
Security: Through multiple scanning and testing stages
By following this guide, you'll establish a robust framework for developing, testing, and deploying modern web applications with confidence and consistency.
Written by

Subroto Sharma
I'm a passionate and results-driven DevOps Engineer with hands-on experience in automating infrastructure, optimizing CI/CD pipelines, and enhancing software delivery through modern DevOps and DevSecOps practices. My expertise lies in bridging the gap between development and operations to streamline workflows, increase deployment velocity, and ensure application security at every stage of the software lifecycle. I specialize in containerization with Docker and Kubernetes, infrastructure-as-code using Terraform, and managing scalable cloud environments—primarily on AWS. I’ve worked extensively with tools like Jenkins, GitHub Actions, SonarQube, Trivy, and various monitoring/logging stacks to build secure, efficient, and resilient systems. Driven by automation and a continuous improvement mindset, I aim to deliver value faster and more reliably by integrating cutting-edge tools and practices into development pipelines.