How to Containerize a Django App with MySQL and Nginx


Welcome to the third blog in our series. In this article, we'll guide you through containerizing a Django application with MySQL as the database and Nginx as a reverse proxy. We'll also automate the deployment using GitHub Actions for smooth CI/CD. This setup ensures scalability, easy deployment, and better management.
Project Overview
The project is a simple notes app built with Django. The Django project is named notesapp, and it serves the pre-built React application from the mynotes/build directory as its template. A second Django app, called api, acts as the backend that handles the API requests.
Architecture Overview
Unlike traditional three-tier architectures where frontend, backend, and database run as separate services, our Django application serves the pre-built React frontend from within the backend itself. Thus, our setup consists of:
Django Backend - Serves API endpoints and hosts the frontend build directory.
MySQL Database - Stores application data.
Nginx Reverse Proxy - Forwards requests to Django and handles static content efficiently.
GitHub Actions CI/CD - Automates build and deployment to keep our services up-to-date.
Self-Hosted GitHub Runner - Ensures deployment runs in a controlled environment.
Note: The Docker containers are connected within a Docker network called myapp-network, and the MySQL data is persisted using a Docker volume named mysql_data.
Dockerizing the Application
1. Django Application Dockerfile
# Use the official Python image from the Docker Hub
FROM python:3.9-slim
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
# Set the working directory
WORKDIR /app
# Install system dependencies required for mysqlclient
RUN apt-get update \
    && apt-get upgrade -y \
    && apt-get install -y gcc default-libmysqlclient-dev pkg-config \
    && rm -rf /var/lib/apt/lists/*
# Copy the requirements file into the container
COPY requirements.txt /app/
# Install the dependencies
RUN pip install --upgrade pip
RUN pip install --no-cache-dir -r requirements.txt
# Copy the rest of the application code into the container
COPY . /app/
# Expose the port the app runs on
EXPOSE 8000
This Dockerfile:
Uses python:3.9-slim for a lightweight base image.
Installs the system dependencies required by the MySQL client.
Copies requirements.txt and installs the Python dependencies.
Copies the application code into the /app working directory.
Exposes port 8000 for communication.
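Before wiring everything together with Compose, you can build this image on its own as a quick sanity check; the tag below is just an arbitrary example name:
docker build -t django-notes-app .
docker image ls django-notes-app    # confirm the image was created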
2. Nginx Reverse Proxy Dockerfile
Let’s create a separate directory named proxy to store the custom Nginx configuration. The configuration file, nginx-default.conf, looks as follows:
server {
    listen 80;
    listen [::]:80;
    server_name localhost;

    location / {
        proxy_pass http://django-app:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
Note: "django-app" is the name of the container used to host the Django app on port 8000.
The Nginx configuration ensures requests to port 80 are proxied to Django running on port 8000.
Let us now use this configuration file to create our Docker image.
FROM nginx:alpine
# Expose port 80
EXPOSE 80
# Copy custom Nginx config file
COPY nginx-default.conf /etc/nginx/conf.d/default.conf
# Start Nginx when the container starts
CMD ["nginx", "-g", "daemon off;"]
Docker Compose Setup
We define the services in docker-compose.yaml to orchestrate the deployment.
services:
  db:
    image: mysql:5.7
    container_name: mysql-db
    environment:
      MYSQL_ROOT_PASSWORD: root
      MYSQL_DATABASE: test_db
    volumes:
      - mysql_data:/var/lib/mysql
    networks:
      - myapp-network

  django:
    container_name: django-app
    build:
      context: ./
    command: sh -c "python manage.py migrate --noinput && gunicorn notesapp.wsgi:application --bind 0.0.0.0:8000"
    env_file:
      - ".env"
    depends_on:
      - db
    networks:
      - myapp-network
    expose:
      - "8000"

  proxy:
    build:
      context: ./proxy
    container_name: nginx-proxy
    ports:
      - "80:80"
    depends_on:
      - django
    networks:
      - myapp-network

volumes:
  mysql_data:

networks:
  myapp-network:
    driver: bridge
The database service (db) runs MySQL 5.7 and persists data using a Docker volume (mysql_data). It is attached to a custom network (myapp-network) for internal communication. The Django service (django) builds from the current directory, applies database migrations on startup, and runs the application using Gunicorn. It reads its environment variables from an .env file and exposes port 8000 for internal communication. The Nginx proxy (proxy) is built from the ./proxy directory, listens on port 80, and forwards requests to Django. It depends on the django service, so it only starts once the Django container is up. All services communicate over myapp-network, enabling seamless interaction between the database, application, and proxy.
The .env file holds the variables Django needs to connect to the database.
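As a rough sketch, the .env file might contain entries like the following; the variable names are placeholders and must match whatever your settings.py actually reads:
# Illustrative placeholders - align these with your settings.py
DB_NAME=test_db
DB_USER=root
DB_PASSWORD=root
DB_HOST=db        # the Compose service name of the MySQL container
DB_PORT=3306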
Running the Application
Now, it is time to deploy the application using the docker-compose.yaml file:
docker compose up --build
If everything is set up correctly, the application should now be reachable through Nginx on port 80.
Note: Make sure the security groups allow HTTP traffic. Port 8000 does not need to be exposed because we have already set up a reverse proxy using Nginx.
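A quick way to confirm everything is up once Compose finishes (run from the host itself):
docker compose ps          # all three services should be listed as running
curl -I http://localhost   # should return a response served through Nginx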
Automating Deployment with GitHub Actions
To streamline deployment, we set up a CI/CD pipeline using GitHub Actions with a self-hosted runner.
Setting Up a Self-Hosted Runner
Navigate to your repository on GitHub.
Go to Settings > Actions > Runners.
Click New self-hosted runner and follow the provided instructions to install it on your server.
Once registered, start the runner with:
./run.sh
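For a longer-lived setup, the runner package also ships a helper script to install it as a service so it keeps running after reboots (execute these from the runner's directory):
sudo ./svc.sh install
sudo ./svc.sh start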
Adding DockerHub Credentials via GitHub Secrets
To securely authenticate with Docker Hub, store your credentials as GitHub secrets:
Navigate to your GitHub repository.
Go to Settings > Secrets and variables > Actions.
Click New repository secret and add:
DOCKER_USERNAME - your Docker Hub username.
DOCKER_PASSWORD - a Docker Hub access token (not your account password, for security reasons).
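If you prefer the terminal, the same secrets can be added with the GitHub CLI, assuming gh is installed and authenticated:
gh secret set DOCKER_USERNAME --body "your-dockerhub-username"
gh secret set DOCKER_PASSWORD --body "your-dockerhub-access-token"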
Setting Up GitHub Actions
Create a .github/workflows/django-cicd.yml file with the following content:
name: CI/CD Pipeline

on:
  push:
    branches:
      - main  # Trigger the workflow when code is pushed to the main branch

jobs:
  build_and_deploy:
    runs-on: self-hosted
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Pull latest code
        run: |
          cd /home/ubuntu/django-notes-app
          git pull origin main

      - name: Log in to Docker Hub
        uses: docker/login-action@v3.3.0
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}

      - name: Build and push Docker image
        uses: docker/build-push-action@v6
        with:
          context: .
          file: ./Dockerfile
          push: true
          tags: ${{ secrets.DOCKER_USERNAME }}/django-todo:latest

      - name: Deploy with Docker Compose
        run: |
          cd /home/ubuntu/django-notes-app
          docker compose pull
          docker compose down
          docker compose up -d
Now, update the django service in the docker-compose.yaml file to use the Docker Hub image instead of the build step.
  django:
    container_name: django-app
    image: anantvaid4/django-todo:latest
    command: sh -c "python manage.py migrate --noinput && gunicorn notesapp.wsgi:application --bind 0.0.0.0:8000"
    env_file:
      - ".env"
    depends_on:
      - db
    networks:
      - myapp-network
    expose:
      - "8000"
CI/CD Workflow Explained:
Triggers on a push to the main branch.
Runs on a self-hosted runner (ideal for private deployments).
Pulls the latest code from GitHub.
Logs in to Docker Hub securely using GitHub secrets.
Builds and pushes the Docker image to Docker Hub.
Deploys the updated containers using Docker Compose.
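Once this is committed, any push to main kicks off a deployment; if the GitHub CLI is available, you can optionally follow the run from the terminal:
git push origin main   # triggers the pipeline
gh run watch           # optional: follow the latest workflow run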
Further Enhancements
We successfully containerized a Django application with MySQL and Nginx using Docker. Additionally, we automated deployments with GitHub Actions and a self-hosted runner, making the process efficient and streamlined.
We can make the following improvements to the project:
Deploying on Kubernetes using Kompose (a quick sketch follows this list)
Securing the DB password using SOPS
Integrating monitoring tools like Prometheus using Helm
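As a starting point for the Kubernetes idea, Kompose can translate the existing Compose file into manifests, which you would then review and adjust before applying them to a cluster:
kompose convert -f docker-compose.yaml -o k8s/   # generate Kubernetes manifests into k8s/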