Why Docker? Understanding the Need for Containers in Modern Development

raunak malhotra


Before diving into Docker, it’s important to understand why it was created. Let’s consider a common scenario in software development. Imagine two developers—Developer A and Developer B—working on different parts of the same application. When they try to run their combined code (let’s call this Case 1), it fails. Why? Because their development environments differ—different operating systems, mismatched library versions, or conflicting dependencies.

Now, consider Case 2: the development team packages the application and sends it to the testing team. Again, it fails to run—this time because the testing environment lacks the exact configuration used during development.

Both cases highlight a common issue: "It works on my machine" syndrome. That’s where Docker comes in.

Docker solves this problem by containerizing applications. A Docker container bundles your code along with all its dependencies, configuration files, and runtime environment—everything needed to run the application—into a lightweight, portable unit that behaves consistently across machines.
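For instance, assuming Docker is installed on the machine, a single command runs an identically packaged application anywhere, regardless of the host's operating system or installed libraries:

```shell
# Run Docker's official test image; it behaves the same on any
# machine with Docker installed, because everything the program
# needs ships inside the image. --rm removes the container on exit.
docker run --rm hello-world
```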


Why Do We Need Docker in Software Development?

Docker has become a vital tool in modern software development—and for good reason. Here's why:

  • Consistent Development and Testing Environments

Docker ensures that every developer and tester works in the exact same environment. This consistency eliminates the infamous “it works on my machine” problem and reduces bugs caused by environment differences.

  • Isolated Application Environments

Each Docker container runs in its own isolated environment. This prevents conflicts between applications and ensures that one app’s dependencies or crashes don’t affect others.

  • Effortless Cloud Deployment

Docker containers are lightweight and portable, making them ideal for cloud environments. Applications packaged in containers can be deployed quickly and easily on any cloud platform.

  • Efficient Resource Utilization

Docker allows multiple containers to run on a single machine without the overhead of traditional virtual machines. This efficient use of system resources helps reduce infrastructure costs while boosting performance.

Now, let's discuss some important terms related to Docker!


Docker Engine

  • Docker Engine is the core of Docker: it creates and manages containers and provides the runtime in which they execute.

Components of Docker Engine:

  • Docker Daemon (dockerd)

    The Docker daemon is the heart of Docker. It runs as a background process and listens for Docker API requests. It is responsible for managing Docker objects such as containers, images, networks, and volumes. The daemon also handles container lifecycle operations like starting, stopping, and deleting containers.

  • Docker CLI

    The Docker Command Line Interface (CLI) is the primary way users interact with Docker. When you type commands like docker run or docker build, the CLI sends those commands to the Docker daemon via the REST API. It serves as a user-friendly interface for managing and automating Docker tasks.

  • Docker REST API

    The REST API allows the CLI (or other tools) to communicate with the Docker daemon programmatically. It’s the communication bridge that enables tools, scripts, or remote applications to interface with Docker, making it extensible and automatable.
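To make the relationship between these three components concrete, here is a small sketch. The first command goes through the CLI; the second sends roughly the same request straight to the daemon's REST API over its default Unix socket (the API version in the path varies by Docker release):

```shell
# Via the CLI: list running containers. The CLI translates this
# command into a REST request and sends it to the Docker daemon.
docker ps

# Roughly the same request, sent directly to the daemon's REST API
# over the default Unix socket. Adjust the API version (v1.43 here)
# to match your installation.
curl --unix-socket /var/run/docker.sock \
     http://localhost/v1.43/containers/json
```

This is why Docker is so easy to automate: anything the CLI can do, a script or remote tool can do through the same API.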


Docker Images and Containers

To understand Docker images and containers, think of object-oriented programming—specifically the relationship between classes and objects.

In OOP, a class acts as a blueprint, defining the structure and behavior, while an object is a specific instance of that class. Similarly:

  • A Docker image is like a class. It’s a static blueprint that includes everything required to run an application—source code, dependencies, configuration files, environment variables, and more.

  • A Docker container is like an object—an active, running instance of an image. It executes the application in an isolated environment based on that image.

You create a Docker image using the docker build command, which reads instructions from a Dockerfile. (We’ll explore Dockerfiles in more detail in an upcoming article.)

This separation of blueprint and runtime instance allows for consistency, scalability, and efficiency in deploying applications across environments.
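A quick sketch of the class/object analogy in practice (the image tag and container names below are illustrative):

```shell
# One image (the "class")...
docker pull nginx:latest

# ...can back many containers (the "objects"), each running in its
# own isolated environment and started or stopped independently.
docker run -d --name web1 nginx:latest
docker run -d --name web2 nginx:latest

# Both containers were created from the same static blueprint.
docker ps --filter ancestor=nginx:latest
```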


What’s Next?

In this article, you learned why Docker exists, how it solves real-world development problems, and what its core components are.

In the next articles, we’ll dive deeper into:

  • Writing your first Dockerfile

  • Using Docker Compose to manage multi-container applications

  • Advanced topics like volumes, networks, and orchestration

Stay tuned—your journey into containerization is just getting started!
