🐳 How I Reduced My Docker Image from ~1GB to ~200MB for a Python App


While containerizing a Python web application using Docker, I encountered a common problem: my image size had ballooned to almost 1 GB.
Before diving into optimization, let’s talk about why large Docker images are a real problem, especially when you're:
Building images locally or in CI/CD pipelines
Deploying to cloud environments
Working remotely on slow networks, like mobile hotspots
A 1 GB+ image means:
Longer build and push times
Slower deployments
Wasted bandwidth and computing resources
Higher storage and transfer costs
After some cleanup and optimization, I reduced the image size to ~200MB — a massive improvement. Here’s exactly how I did it.
✅ Step 0: Create a .dockerignore File
Before we optimize anything in the Dockerfile, the first step is to add a .dockerignore file.
Why? Because Docker will copy everything in your project directory into the image unless you tell it not to, including things like .git, virtual environments, build artifacts, and editor configs.
Here’s a basic example of a .dockerignore:
__pycache__/
*.pyc
*.pyo
*.pyd
*.db
*.sqlite3
*.log
*.env
.env*
venv/
.git/
.idea/
*.egg-info/
dist/
build/
This drastically reduces build context size and ensures cleaner images from the start.
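You can sanity-check the effect by watching the context size Docker reports when a build starts (the myapp tag below is just a placeholder):
# Run from the project root; the tag name is arbitrary
docker build -t myapp .
# The classic builder prints "Sending build context to Docker daemon  <size>" first;
# with BuildKit, the "[internal] load build context" step shows the transferred size.
# Either number should drop noticeably once .dockerignore is in place.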
⚙️ Step 1: Use a Slim Base Image
In my original Dockerfile, I used the full python:3.10.11 image. It’s convenient but includes a lot of unnecessary tools.
I switched to the slim version:
FROM python:3.10.11-slim
This one change cut hundreds of megabytes from the image. But using slim also means you may need to explicitly install build dependencies for some Python packages.
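If you want to verify the gap yourself, pull both tags and compare them locally (exact sizes vary by platform and tag, so treat this as a rough check):
docker pull python:3.10.11
docker pull python:3.10.11-slim
# List both local tags side by side; the slim variant is typically several hundred MB smaller
docker images python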
🧰 Step 2: Install Build Dependencies in Stage One
Some Python packages like psycopg2, cryptography, or numpy require native libraries or compilation. So we install system-level build dependencies in a build stage:
# Stage 1: Build dependencies and install packages
FROM python:3.10.11-slim as builder
WORKDIR /app
# Install build dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
build-essential \
gcc \
libpq-dev \
&& rm -rf /var/lib/apt/lists/*
# Copy requirements file and install packages into a separate directory
COPY requirements.txt .
RUN pip install --upgrade pip \
&& pip install --prefix=/install-packages --no-cache-dir -r requirements.txt
We’re installing all Python dependencies into /install-packages so we can later copy just the results into the final image, keeping it lean.
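If you’re wondering why copying /install-packages straight into /usr/local works in the next step: pip’s --prefix option writes the standard bin/ and lib/python3.10/site-packages/ layout, which is the same layout /usr/local uses in the official Python images. You can peek at it by building just the first stage (myapp-builder is a placeholder tag):
# Build only the builder stage and look inside it
docker build --target builder -t myapp-builder .
docker run --rm myapp-builder ls /install-packages
# Expect roughly: bin/ (console scripts such as gunicorn) and lib/ (site-packages)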
🏗️ Step 3: Multi-Stage Build for Minimal Final Image
Next, we create a second stage where we build the final runtime environment. This is what runs your app in production:
# Stage 2: Final runtime image
FROM python:3.10.11-slim
# Environment settings
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
WORKDIR /app
# Install runtime dependency required by psycopg2
RUN apt-get update && apt-get install -y --no-install-recommends \
libpq5 \
&& rm -rf /var/lib/apt/lists/*
# Copy only the necessary files
COPY --from=builder /install-packages /usr/local
COPY . /app/
# Run using Gunicorn
CMD ["gunicorn", "--config", "gunicorn_config.py", "wsgi"]
Key benefits of multi-stage builds:
No build tools in the final image
No source files, tests, or unused code
Only runtime dependencies are copied over
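To see the payoff, build the final image, check its size, and give it a quick smoke test (myapp and the port are placeholders; adjust the port to whatever your gunicorn_config.py binds to):
docker build -t myapp .
# Shows the final image size
docker images myapp
# Quick smoke test; map the port your Gunicorn config listens on
docker run --rm -p 8000:8000 myapp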
📉 Final Result
After applying these changes, here's the size comparison:
Stage | Image Size
Initial Image | ~1GB
Optimized Image | ~200MB
💡 That's an 80% reduction in image size!
🔁 Recap
Here’s a quick recap of what we did to reduce the Docker image size:
✅ Added a .dockerignore file to shrink the build context
⚙️ Switched to python:3.10.11-slim to avoid unnecessary bloat
🧰 Installed build tools in a temporary build stage so they never reach the final image
🏗️ Used multi-stage builds to produce a clean, minimal final image
🔵 Original Dockerfile (Unoptimized)
FROM python:3.10.11
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
RUN mkdir /app
WORKDIR /app
COPY . /app/
RUN pip install --no-cache-dir -r requirements.txt
CMD ["gunicorn", "--config", "gunicorn_config.py", "wsgi"]
✅ Improved Dockerfile (Optimized)
# Stage 1: Build stage
FROM python:3.10.11-slim as builder
WORKDIR /app
# Install build dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
build-essential \
gcc \
libpq-dev \
&& rm -rf /var/lib/apt/lists/*
# Copy requirements and install packages into a separate folder
COPY requirements.txt .
RUN pip install --upgrade pip \
&& pip install --prefix=/install-packages --no-cache-dir -r requirements.txt
# Stage 2: Final runtime image
FROM python:3.10.11-slim
# Environment settings
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
WORKDIR /app
# Install runtime dependency (libpq5 needed by psycopg2)
RUN apt-get update && apt-get install -y --no-install-recommends \
libpq5 \
&& rm -rf /var/lib/apt/lists/*
# Copy installed Python packages from builder stage
COPY --from=builder /install-packages /usr/local
# Copy application code
COPY . /app/
# Start the application
CMD ["gunicorn", "--config", "gunicorn_config.py", "wsgi"]
Happy Learning 😊
#Docker #Python #DevOps #Dockerfile #WebDevelopment #PerformanceOptimization #MultistageBuild