From Deployment Disaster to Docker Success: My Journey with 5 Personal Projects

Abigeal Afolabi
5 min read

Introduction

Deployment failures can make or break a developer's confidence. After my first client project turned into a three-day debugging nightmare, I decided to master Docker through hands-on practice. Here's how 5 personal projects transformed my deployment process and professional reputation.

The Problem: When "Works on My Machine" Isn't Enough

My first paying client hired me to build an inventory management system using Node.js and MongoDB. The application ran flawlessly in my development environment, but deployment day became a disaster.

The issues I encountered:

  • Different Node.js versions between local and server

  • Missing system dependencies

  • Environment configuration inconsistencies

  • Database connection problems

Three days of troubleshooting later, I realized I needed a fundamental shift in my approach.

The Solution: Learning Docker Through Personal Projects

Instead of diving into complex tutorials, I created 5 progressively challenging personal projects to master Docker concepts systematically.

Project 1: Simple Todo Application

Technologies: Node.js, Express.js, SQLite
Docker Concepts Learned: Basic containerization, Dockerfile creation

FROM node:16-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD ["npm", "start"]

Key Insight: Understanding the difference between images and containers
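
A quick way to make that distinction concrete on the command line (the todo-app tag and todo-1 container name are just example names):

docker build -t todo-app .                      # builds an image: the reusable template
docker run -d -p 3000:3000 --name todo-1 todo-app   # starts a container: a running instance of that image
docker ps                                       # lists running containers
docker images                                   # lists the images they were started from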

Project 2: Personal Blog API

Technologies: Node.js, Express.js, MongoDB
Docker Concepts Learned: Multi-container applications, Docker Compose

version: '3.8'
services:
  api:
    build: .
    ports:
      - "3000:3000"
    environment:
      - MONGODB_URI=mongodb://mongo:27017/blog
    depends_on:
      - mongo

  mongo:
    image: mongo:4.4
    volumes:
      - blog_data:/data/db

volumes:
  blog_data:

Key Insight: Container networking and service communication
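
Compose puts both services on a shared network where each container can reach the other by its service name, which is why the connection string uses mongo as the hostname. A quick way to check that name resolution from inside the api container (a sketch, assuming the stack is already up):

docker compose up -d
docker compose exec api node -e "require('dns').lookup('mongo', (err, addr) => console.log(addr))"   # prints the mongo container's IP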

Project 3: File Upload Service

Technologies: Node.js, Multer, Sharp (image processing)
Docker Concepts Learned: Volume mounting, persistent data storage

FROM node:16-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
VOLUME ["/app/uploads"]
EXPOSE 3000
CMD ["npm", "start"]

Key Insight: Data persistence beyond container lifecycle
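
The point of the volume is that uploaded files outlive any individual container. A minimal sketch of how to see that (image, container, and volume names are examples):

docker build -t upload-service .
docker run -d --name uploader -v uploads_data:/app/uploads -p 3000:3000 upload-service
# ...upload a few files through the API, then throw the container away...
docker rm -f uploader
# a brand-new container attached to the same named volume still sees the files
docker run --rm -v uploads_data:/app/uploads alpine ls /app/uploads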

Project 4: Real-time Chat Application

Technologies: Node.js, Socket.io, Redis
Docker Concepts Learned: Advanced networking, service dependencies

version: '3.8'
services:
  chat:
    build: .
    ports:
      - "3000:3000"
    environment:
      - REDIS_URL=redis://redis:6379
    depends_on:
      - redis

  redis:
    image: redis:alpine
    command: redis-server --appendonly yes
    volumes:
      - redis_data:/data

volumes:
  redis_data:

Key Insight: Real-time applications in containerized environments
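
Because Redis runs with --appendonly yes and writes to a named volume, any state it holds survives a container restart. A quick check (the key and value are just examples):

docker compose up -d
docker compose exec redis redis-cli set last_message "hello"
docker compose restart redis
docker compose exec redis redis-cli get last_message   # still "hello" thanks to AOF plus the volume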

Project 5: Full-Stack E-commerce Platform

Technologies: React, Node.js, PostgreSQL, Nginx
Docker Concepts Learned: Complex orchestration, production deployment

version: '3.8'
services:
  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile.prod
    depends_on:
      - backend

  backend:
    build: ./backend
    environment:
      - DATABASE_URL=postgresql://ecommerce:password@db:5432/ecommerce
    depends_on:
      - db

  db:
    image: postgres:13
    environment:
      - POSTGRES_DB=ecommerce
      - POSTGRES_USER=ecommerce
      - POSTGRES_PASSWORD=password
    volumes:
      - postgres_data:/var/lib/postgresql/data

  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf
    depends_on:
      - frontend
      - backend

volumes:
  postgres_data:

Key Insight: Production-ready containerized applications
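
The nginx.conf mounted above isn't part of the original setup shown here; a minimal sketch of what it could contain, assuming the frontend container serves on port 3000 and the backend exposes its API under /api (the service names come from the Compose file, the ports are assumptions):

events {}

http {
  server {
    listen 80;

    # API traffic goes to the Node backend service
    location /api/ {
      proxy_pass http://backend:3000;
    }

    # everything else goes to the React frontend service
    location / {
      proxy_pass http://frontend:3000;
    }
  }
}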

The Transformation: Before vs After

Before Docker

  • Deployment success rate: ~30%

  • Average deployment time: 2-3 days

  • Client satisfaction: Low due to delays

  • Professional confidence: Shaken

After Docker

  • Deployment success rate: 95%+

  • Average deployment time: 20 minutes

  • Client satisfaction: High due to reliability

  • Professional confidence: Significantly improved

Best Practices I Discovered

1. Multi-stage Builds for Production

# Build stage
FROM node:16-alpine AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production

# Production stage
FROM node:16-alpine AS production
WORKDIR /app
COPY --from=builder /app/node_modules ./node_modules
COPY . .
RUN addgroup -g 1001 -S nodejs && adduser -S nodejs -u 1001
USER nodejs
EXPOSE 3000
CMD ["npm", "start"]
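
To confirm that only the final stage ends up in the shipped image, build it and inspect the result (the Dockerfile.prod filename and my-app tag are examples):

docker build -f Dockerfile.prod -t my-app:prod .
docker images my-app:prod
docker history my-app:prod   # only the production stage's layers show up here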

2. Environment-Specific Configuration

# docker-compose.prod.yml
version: '3.8'
services:
  app:
    build:
      context: .
      dockerfile: Dockerfile.prod
    restart: unless-stopped
    environment:
      - NODE_ENV=production
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:3000/health"]
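      # note: curl is not included in node:16-alpine by default; install it in the image or switch this test to busybox wget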
      interval: 30s
      timeout: 10s
      retries: 3
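
A common way to use this file (not shown above) is to layer it over the base docker-compose.yml so the production settings override the defaults:

docker compose -f docker-compose.yml -f docker-compose.prod.yml up -d --build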

3. Security Considerations

  • Never run containers as root user

  • Use official base images

  • Regularly update dependencies

  • Implement health checks

  • Use secrets for sensitive data (see the Compose sketch after this list)
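
For that last point, Docker Compose supports file-based secrets that are mounted into the container at /run/secrets/<name> instead of being passed as environment variables. A minimal sketch (file paths and names are examples):

version: '3.8'
services:
  app:
    build: .
    secrets:
      - db_password

secrets:
  db_password:
    file: ./secrets/db_password.txt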

Impact on My Development Career

Technical Benefits

  • Consistent deployment across environments

  • Simplified development setup for team members

  • Easier scaling and maintenance

  • Improved application reliability

Business Benefits

  • Faster project delivery

  • Increased client confidence

  • Ability to work with international clients

  • Professional presentation of technical skills

Lessons for Nigerian Developers

Working from Lagos, I found that Docker gave me some specific advantages:

  • Reliable deployments despite inconsistent hosting environments

  • Professional presentation when competing with global developers

  • Efficient client demonstrations without physical meetings

  • Consistent development environments across different machines

Common Docker Mistakes to Avoid

  • Large image sizes - Use Alpine versions and multi-stage builds

  • Hardcoded configurations - Always use environment variables

  • Running as root - Create dedicated users for security

  • Ignoring health checks - Implement proper monitoring

  • Not using .dockerignore - Exclude unnecessary files (example .dockerignore after this list)
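
A typical .dockerignore for a Node.js project looks something like this (entries are examples; tailor them to your repository):

node_modules
npm-debug.log
.git
.env
Dockerfile
docker-compose*.yml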

Getting Started: Your Docker Learning Path

Step 1: Choose Your First Project

Pick a simple application you've already built - preferably something with a database connection.

Step 2: Create a Basic Dockerfile

Start with the simplest possible configuration:

FROM node:16-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD ["npm", "start"]

Step 3: Test Locally

docker build -t my-app .
docker run -p 3000:3000 my-app

Step 4: Add Database Services

Create a docker-compose.yml file to orchestrate multiple containers.
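
A minimal starting point, assuming the same Node app plus a MongoDB container (service names, ports, and the database name are examples):

version: '3.8'
services:
  app:
    build: .
    ports:
      - "3000:3000"
    environment:
      - MONGODB_URI=mongodb://mongo:27017/myapp
    depends_on:
      - mongo

  mongo:
    image: mongo:4.4
    volumes:
      - mongo_data:/data/db

volumes:
  mongo_data: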

Step 5: Deploy to Production

Use platforms like DigitalOcean App Platform, Railway, or Render for easy containerized deployments.

Conclusion

Learning Docker through personal projects transformed my approach to software deployment. The confidence gained from reliable, repeatable deployments has had a profound impact on my professional relationships and business success.

Key takeaways:

  • Start with simple personal projects

  • Build complexity gradually

  • Focus on solving real deployment problems

  • Practice regularly to build muscle memory

  • Document your learning journey

The journey from "works on my machine" to "works everywhere" is worth every hour invested in learning containerization.


What's your Docker learning story? Share your experiences and challenges in the comments below.

