Three-Tier Architecture: Containerizing Frontend, Backend, and Database in Perfect Harmony

Subroto Sharma
13 min read

In the modern digital landscape, the applications we interact with daily—from e-commerce platforms to social media sites—are typically built using a three-tier architecture. This architectural pattern organizes applications into distinct layers, making them more maintainable, scalable, and efficient.

This guide walks you through deploying a three-tier web application using Docker Compose, creating a seamless DevOps pipeline across Development, Quality Assurance (QA), and Production environments.

Understanding Three-Tier Web Application Architecture

A three-tier architecture separates an application into three interconnected layers:

1. Presentation Layer (Frontend)

This is what users see and interact with—the visual interface including buttons, forms, images, and navigation elements.

Technologies commonly used:

  • HTML (structure)

  • CSS (styling)

  • JavaScript (interactivity)

2. Logic Layer (Backend)

The "brain" of the application that processes requests, executes business logic, and communicates between the presentation and data layers.

Technologies commonly used:

  • Programming languages: Python, Java, Node.js, Ruby

  • Frameworks: Django, Spring, Express

  • APIs for communication between layers

3. Data Layer (Database)

The storage system that maintains all persistent information, from user profiles to transaction records.

Technologies commonly used:

  • SQL databases: MySQL, PostgreSQL

  • NoSQL databases: MongoDB

  • Cloud databases: AWS RDS, Google Cloud SQL

How These Layers Interact

Imagine searching for "sneakers" on an online store:

  1. You enter your search in the frontend interface

  2. The backend processes this request and queries the database

  3. The database returns matching products to the backend

  4. The backend processes this information and sends it to the frontend

  5. The frontend displays the results on your screen
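
In HTTP terms, step 2 of this flow is typically a REST call from the frontend to a backend API endpoint, which the backend translates into a database query. The host, port, and response shown here are illustrative only, not taken from the repository:

curl "http://backend:5000/api/products?q=sneakers"
# → [{"id": 101, "name": "Trail Runner", "price": 89.99}, ...]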

This separation of concerns makes applications more organized, easier to maintain, and more scalable.

Deployment Strategy Overview

Our deployment approach uses Docker Compose to containerize and manage the three layers of our application. We'll implement a comprehensive DevOps pipeline with three distinct phases:

  1. Local Testing - Ensuring everything works correctly in the development environment

  2. CI/CD Pipeline Setup - Implementing automated testing and deployment

  3. Multi-Environment Pipeline - Establishing separate pipelines for Development, QA, and Production

Phase 1: Testing Locally

Step 1: Set Up a Base EC2 Instance

  1. Launch an Ubuntu EC2 instance in AWS

  2. Create a new key pair for secure access

  3. Configure security groups to open the necessary ports: SSH (22), HTTP (80), HTTPS (443), and the application-specific ports used later in this guide (8080 for Jenkins, 9000 for SonarQube, 5173 for the frontend); the CLI sketch below shows the equivalent rules
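
If you prefer the AWS CLI, the equivalent ingress rules look roughly like this (sg-xxxxxxxx is a placeholder for your security group ID):

# sg-xxxxxxxx is a placeholder; substitute your own security group ID
SG=sg-xxxxxxxx
for PORT in 22 80 443 8080 9000 5173; do
  aws ec2 authorize-security-group-ingress \
    --group-id "$SG" --protocol tcp --port "$PORT" --cidr 0.0.0.0/0
done
# in a real deployment, restrict SSH (port 22) to your own IP rather than 0.0.0.0/0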

Step 2: Configure IAM Role for EC2

An IAM role grants your EC2 instance permissions to interact with other AWS services.

  1. Create a new IAM role with AdministratorAccess policy

  2. Attach this role to your EC2 instance via Actions → Security → Modify IAM Role

In the AWS console:

  1. Type "IAM" in the search bar and open the IAM service

  2. Click Roles in the left-hand menu

  3. Click Create role and choose EC2 as the trusted entity type

  4. Click Next and select the AdministratorAccess policy in the permissions section

  5. Click Next, give the role a name, and click Create role

  6. Back in the EC2 console, select your instance and choose Actions → Security → Modify IAM role to attach the new role

Note that AdministratorAccess is used here for simplicity; in a real environment, scope the role down to only the permissions the pipeline needs.

Step 3: Clone the Repository

Connect to your EC2 instance over SSH using the key pair you created, then clone the application repository:

sudo su
apt update -y
git clone https://github.com/subrotosharma/Three-Tier-Web-Application-E-commerce.git
cd Three-Tier-Web-Application-E-commerce

Step 4: Install the Necessary Tools

Run the provided setup script, then check your terminal to confirm everything installed correctly:

bash script.sh
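
To quickly confirm the key tools landed (this assumes script.sh installs Docker, Docker Compose, Trivy, and Jenkins; adjust the checks to whatever your script actually provisions):

docker --version
docker-compose --version
trivy --version
systemctl status jenkins --no-pager
docker ps    # SonarQube typically runs as a container listening on port 9000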

Step 5: Deploy Locally Using Docker Compose

Launch all containers with a single command:

docker-compose up -d

This will create and connect three containers: frontend, backend, and database.
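
For reference, the repository's docker-compose.yml is shaped roughly like the sketch below. The service names match the containers referenced later in this guide, but the images, build paths, ports, and volume are assumptions, so check the actual file in the repo:

version: "3.8"
services:
  mongo:
    container_name: mongo
    image: mongo:latest              # data layer
    volumes:
      - mongo-data:/data/db          # persist data across restarts
  backend:
    container_name: backend
    build: ./backend                 # logic layer (Node.js)
    depends_on:
      - mongo
  frontend:
    container_name: frontend
    build: ./frontend                # presentation layer
    ports:
      - "5173:5173"                  # the port used in the URL below
    depends_on:
      - backend
volumes:
  mongo-data: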

Access your application at http://[public-ip]:5173

If the Featured section doesn't display any content, the database has not been seeded yet. First, find the name of the MongoDB container:

docker ps

To populate the database with sample data:

docker exec -it [mongo-container-name] mongoimport --db wanderlust --collection posts --file ./data/sample_posts.json --jsonArray
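
This assumes data/sample_posts.json is already present inside the Mongo container (for example, via a bind mount defined in the compose file). If it isn't, copy it in first and point --file at the copied path:

docker cp data/sample_posts.json [mongo-container-name]:/data/sample_posts.json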

Phase 2: Setting Up the CI/CD Pipeline

Step 1: Set Up SonarQube

  1. Access SonarQube at http://[public-ip]:9000

  2. Log in with default credentials (admin/admin)

  3. Set a new password when prompted

Step 2: Set Up Jenkins

  1. Access Jenkins at http://[public-ip]:8080

  2. Retrieve the initial admin password:

sudo cat /var/lib/jenkins/secrets/initialAdminPassword

  3. Install the suggested plugins

  4. Create your Jenkins admin user

Step 3: Install Required Jenkins Plugins

  • Eclipse Temurin Installer

  • SonarQube Scanner

  • Docker Compose Build Step

  • NodeJS Plugin

  • OWASP Dependency-Check

  • Prometheus metrics

  • Docker-related plugins

Step 4: Set Up Credentials

SonarQube Credentials:

  1. Generate a token in SonarQube (Security → Users → Tokens)

  2. Add this as a Secret Text credential in Jenkins with ID "sonar-token"

Docker Hub Credentials:

  1. Add your Docker Hub username and password as credentials in Jenkins with ID "docker-cred"

Step 5: Configure Jenkins Tools

Under Manage Jenkins → Tools, configure the following:

  1. Set up JDK 17

  2. Configure Node.js 16.2.0

  3. Set up Docker

  4. Configure SonarQube Scanner

  5. Set up OWASP Dependency-Check

Step 6: Create and Run the Pipeline

Create a new pipeline job in Jenkins with the following script:

pipeline {
    agent any
    tools {
        jdk 'jdk17'
        nodejs 'node16'
    }
    environment {
        SCANNER_HOME = tool 'sonar-scanner'
    }
    stages {

        stage('Checkout from Git') {
            steps {
                git branch: 'main', url: 'https://github.com/subrotosharma/Three-Tier-Web-Application-E-commerce.git'
            }
        }

        stage('Install Dependencies') {
            steps {
                script {
                    dir('backend') {
                        sh 'npm install'
                    }
                    dir('frontend') {
                        sh 'npm install'
                    }
                }
            }
        }

        stage('Prepare Env Files') {
            steps {
                script {
                    sh '''
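                        # copy the sample env file into place if it exists (no-op otherwise)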
                        [ ! -f frontend/.env.sample ] || cp frontend/.env.sample frontend/.env.local
                    '''
                }
            }
        }

        stage("Sonarqube Analysis") {
            steps {
                withSonarQubeEnv('sonar-server') {
                    sh '''$SCANNER_HOME/bin/sonar-scanner \
                        -Dsonar.projectName=docker-compose \
                        -Dsonar.projectKey=docker-compose'''
                }
            }
        }

        stage("Quality Gate") {
            steps {
                script {
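                    // waitForQualityGate returns only after SonarQube calls back, so create a
                    // webhook in SonarQube (Administration → Configuration → Webhooks) pointing
                    // at http://[jenkins-ip]:8080/sonarqube-webhook/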
                    waitForQualityGate abortPipeline: false, credentialsId: 'sonar-token'
                }
            }
        }

        stage('OWASP FS SCAN') {
            steps {
                dependencyCheck additionalArguments: '--scan ./ --disableYarnAudit --disableNodeAudit', odcInstallation: 'DP-Check'
                dependencyCheckPublisher pattern: '**/dependency-check-report.xml'
            }
        }

        stage('TRIVY FS SCAN') {
            steps {
                // emit JSON so the .json extension matches the content
                sh 'trivy fs --format json --output trivyfs.json .'
            }
        }

        stage('Docker-compose Build') {
            steps {
                script {
                    timeout(time: 2, unit: 'MINUTES') {
                        sh '''
                            echo "Cleaning up containers if they exist..."
                            docker-compose down
                            docker rm -f mongo || true
                            docker rm -f frontend || true
                            docker rm -f backend || true

                            echo "Building and starting containers..."
                            docker-compose up -d --build --remove-orphans --force-recreate
                        '''
                    }
                }
            }
        }

        stage('Docker-compose Push') {
            steps {
                script {
                    withDockerRegistry(credentialsId: 'docker-cred', toolName: 'docker') {
                        sh '''
                            echo "Tagging and pushing images to Docker Hub..."
                            docker tag devpipeline-backend subrotosharma/devpipeline-backend:latest
                            docker tag devpipeline-frontend subrotosharma/devpipeline-frontend:latest

                            docker push subrotosharma/devpipeline-backend:latest
                            docker push subrotosharma/devpipeline-frontend:latest
                        '''
                    }
                }
            }
        }

        stage('TRIVY Image Scan') {
            steps {
                sh '''
                    # emit JSON so the .json extensions match the content
                    trivy image --format json --output trivy_backend.json subrotosharma/devpipeline-backend:latest
                    trivy image --format json --output trivy_frontend.json subrotosharma/devpipeline-frontend:latest
                '''
            }
        }
    }
}

This pipeline will:

  1. Check out code from GitHub

  2. Install dependencies

  3. Run SonarQube analysis for code quality

  4. Perform security scanning with OWASP and Trivy

  5. Build Docker images

  6. Push images to Docker Hub

  7. Deploy the application

Phase 3: Multi-Environment Pipeline Setup

Why Multiple Environments Matter

The path from development to production requires multiple environments to ensure quality at each stage:

1. Development Environment

  • Purpose: Safe testing ground for new features and bug fixes

  • Users: Developers

  • Goal: Catch issues early in the development process

2. Quality Assurance Environment

  • Purpose: Thorough testing in controlled conditions

  • Users: QA teams

  • Goal: Identify bugs and performance issues, and ensure stability

3. Production Environment

  • Purpose: Live environment for real users

  • Users: Customers/end-users

  • Goal: Provide stable, reliable service

Development Pipeline

pipeline {
    agent any

    stages {
        stage('Checkout Code') {
            steps {
                git 'https://github.com/subrotosharma/Three-Tier-Web-Application-E-commerce.git'
            }
        }

        stage('Build Docker Images') {
            steps {
                sh 'docker-compose build'
            }
        }

        stage('Run Unit Tests') {
            steps {
                // the backend in this project is Node.js, so run its npm test script
                // (assumes a "test" script is defined in backend/package.json)
                sh 'docker-compose run --rm backend npm test'
            }
        }

        stage('Deploy to Development') {
            steps {
                sh 'docker-compose up -d'
            }
        }
    }

    post {
        success {
            echo "Development deployment successful, triggering QA pipeline in 10 seconds..."
            sleep 10
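            // 'QA_Pipeline' must match the exact name of the QA pipeline job created below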
            build job: 'QA_Pipeline'
        }
        failure {
            echo "Development deployment failed."
        }
    }
}

QA Pipeline

pipeline {
    agent any

    stages {
        stage('Pull Docker Images') {
            steps {
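                // pull succeeds only if the compose file (or an override) references the
                // pushed subrotosharma/devpipeline-* images rather than local build contexts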
                sh 'docker-compose pull'
            }
        }

        stage('Deploy to QA') {
            steps {
                sh 'docker-compose up -d'
            }
        }

        stage('Run Functional Tests') {
            steps {
                sh 'docker-compose run frontend npm run test'
            }
        }

        stage('Run End-to-End Tests') {
            steps {
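                // assumes the behave runner and a tests/ directory exist inside the backend image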
                sh 'docker-compose run backend behave tests/'
            }
        }

        stage('Run Performance Tests') {
            steps {
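                // assumes stress is installed in the backend image; this applies CPU load as a
                // basic stability check rather than a full performance test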
                sh 'docker-compose run backend stress --cpu 8 --timeout 10'
            }
        }
    }

    post {
        success {
            echo "QA testing successful, triggering Production pipeline in 10 seconds..."
            sleep 10
            build job: 'Production_Pipeline'
        }
        failure {
            echo "QA testing failed."
        }
    }
}

Production Pipeline

pipeline {
    agent any

    stages {
        stage('Pull Docker Images') {
            steps {
                sh 'docker-compose pull'
            }
        }

        stage('Manual Approval') {
            steps {
                input message: 'Approve deployment to production?'
            }
        }

        stage('Deploy to Production') {
            steps {
                sh 'docker-compose up -d'
            }
        }

        stage('Run Smoke Tests') {
            steps {
                // -f makes curl exit non-zero on HTTP errors, so a broken endpoint fails the stage
                sh 'curl -f http://your-production-url'
            }
        }

        stage('Post-Deployment Notifications') {
            steps {
                echo "Deployment to production was successful!"
                // Add notification steps (Slack, email, etc.)
            }
        }
    }

    post {
        success {
            echo "Production deployment successful."
        }
        failure {
            echo "Production deployment failed."
        }
    }
}

Conclusion

This comprehensive approach to deploying a three-tier web application using Docker Compose and implementing a full DevOps pipeline ensures:

  1. Reliability: Through systematic testing across multiple environments

  2. Quality: Via automated code quality and security checks

  3. Efficiency: With containerization and automation

  4. Scalability: By separating concerns into distinct layers

  5. Security: Through multiple scanning and testing stages

By following this guide, you'll establish a robust framework for developing, testing, and deploying modern web applications with confidence and consistency.
