Docker for DevOps Engineers
Docker Compose is a powerful tool for managing multi-container Docker applications: a single YAML file configures all the services, networks, and volumes your application needs. In this blog post, we'll dive deeper into Docker Volumes and Docker Networks, with practical examples to help you leverage these features in your containerized applications.
Introduction to Docker Volumes
Docker Volumes provide a way to persist data generated by and used by Docker containers. Without volumes, data stored in a container is lost when the container is removed. Volumes solve this problem by storing data in a specific location on the host machine, separate from the container’s filesystem. This means the data persists even if the container is stopped or removed.
Benefits of Using Docker Volumes
Persistence: Data stored in volumes persists even if the container is deleted.
Sharing Data: Multiple containers can access and share the same volume.
Data Management: Volumes can be managed independently from containers.
Backup & Restore: Volumes can be easily backed up and restored.
Creating and Using Docker Volumes
Here’s how you can use Docker Volumes:
Creating a Volume
docker volume create my-volume
This command creates a named volume called my-volume.
Using a Volume with docker run
docker run -d -v my-volume:/data --name my-container alpine sleep 1000
In this command:
-d runs the container in detached mode.
-v my-volume:/data mounts the volume my-volume to the /data directory in the container.
sleep 1000 keeps the container running; a bare alpine container has no long-running process and would exit immediately.
Listing Volumes
docker volume ls
This lists all volumes on your system.
Inspecting a Volume
docker volume inspect my-volume
This command provides detailed information about the volume.
Removing a Volume
docker volume rm my-volume
This removes the specified volume.
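If you manage services with Docker Compose (covered later in this post), the same named volume can be declared in the Compose file instead of on the command line. A minimal sketch, assuming a placeholder service named app:

```yaml
# Hypothetical Compose sketch: declares the named volume my-volume
# and mounts it at /data, mirroring the docker run -v example above.
version: '3.8'
services:
  app:
    image: alpine
    command: sleep 1000   # keep the container running
    volumes:
      - my-volume:/data
volumes:
  my-volume:
```

Compose creates the volume on docker-compose up if it does not already exist.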
Introduction to Docker Networks
Docker Networks allow you to create virtual networks for your containers, enabling them to communicate with each other and with the host machine. Networks are essential for setting up isolated environments for different applications or services while allowing them to interact.
Benefits of Using Docker Networks
Isolation: Networks isolate container traffic, improving security.
Service Discovery: Containers on the same network can communicate using container names.
Custom Network Configurations: You can define network settings and controls for container communications.
Creating and Using Docker Networks
Creating a Network
docker network create my-network
This command creates a new network called my-network.
Connecting Containers to a Network
docker run -d --network my-network --name my-container1 alpine sleep 1000
docker run -d --network my-network --name my-container2 alpine sleep 1000
Both containers are attached to my-network and can reach each other by container name; for example, docker exec my-container1 ping -c 1 my-container2 should succeed. (The sleep 1000 command keeps the containers running.)
Listing Networks
docker network ls
This command lists all networks.
Inspecting a Network
docker network inspect my-network
This provides detailed information about the network.
Removing a Network
docker network rm my-network
This removes the specified network.
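In Compose, a user-defined network like my-network is declared once and attached to each service, and service names then double as DNS names on that network. A minimal sketch, assuming two placeholder services app1 and app2:

```yaml
# Hypothetical Compose sketch: both services join my-network and can
# reach each other by service name (app1, app2).
version: '3.8'
services:
  app1:
    image: alpine
    command: sleep 1000
    networks:
      - my-network
  app2:
    image: alpine
    command: sleep 1000
    networks:
      - my-network
networks:
  my-network:
```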
Task 1: Multi-Container Docker Compose File
In this task, we will create a Docker Compose file that brings up a multi-container application with an application container and a database container.
Creating docker-compose.yml
Here's a basic example of a docker-compose.yml file that defines a web application and a MySQL database:
version: '3.8'
services:
  web:
    image: nginx:latest
    ports:
      - "8080:80"
    networks:
      - webnet
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: example
    networks:
      - webnet
    volumes:
      - dbdata:/var/lib/mysql
networks:
  webnet:
volumes:
  dbdata:
In this file:
The web service uses the nginx image and maps port 8080 on the host to port 80 in the container.
The db service uses the mysql image and sets a root password.
Both services are connected to the webnet network.
The db service uses a named volume dbdata for persistent storage.
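One detail the file above leaves implicit is startup order: web may come up before db is ready to accept connections. A hedged sketch of one common way to express this, using a healthcheck on db and depends_on on web (the condition form is supported by recent Docker Compose versions following the Compose Specification; older 3.x file formats ignored it, and the exact MySQL healthcheck command may vary):

```yaml
# Hypothetical additions to the services above: web waits until
# db reports healthy before starting.
services:
  web:
    image: nginx:latest
    depends_on:
      db:
        condition: service_healthy
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: example
    healthcheck:
      test: ["CMD", "mysqladmin", "ping", "-h", "localhost", "-pexample"]
      interval: 10s
      timeout: 5s
      retries: 5
```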
Commands to Manage the Multi-Container Application
Start the Application:
docker-compose up -d
This command starts the containers defined in docker-compose.yml in detached mode.
Scale the Application:
docker-compose up -d --scale web=3
This command scales the web service to 3 replicas. Note that the example file publishes a fixed host port (8080:80), so scaling web beyond one replica will fail with a port conflict; publish only the container port (e.g. "80") to let Docker assign random host ports, or put a load balancer in front.
View Container Status:
docker-compose ps
This command shows the status of all containers.
View Logs:
docker-compose logs web
This command shows logs for the web service.
Stop and Remove Containers:
docker-compose down
This command stops and removes the containers and networks defined in the docker-compose.yml file. Named volumes such as dbdata are preserved by default; add the -v flag (docker-compose down -v) to remove them as well.
Task 2: Using Docker Volumes with Multiple Containers
In this task, we will create two containers that share a volume and verify data consistency.
Creating Containers with Shared Volume
Run Containers with Shared Volume
docker run -d --name container1 -v shared-volume:/data alpine sleep 1000
docker run -d --name container2 -v shared-volume:/data alpine sleep 1000
Both containers use the shared-volume volume mounted to the /data directory. (The sleep 1000 command keeps the containers running so that we can exec into them.)
Write Data to the Volume
docker exec container1 sh -c "echo 'Hello from container1' > /data/message.txt"
This command writes a message to a file inside the volume from container1.
Read Data from the Volume
docker exec container2 cat /data/message.txt
This command reads the message from the file inside the volume from container2, verifying that both containers share the same data.
List Volumes
docker volume ls
This command lists all volumes, including shared-volume.
Remove the Volume
docker volume rm shared-volume
This command removes the volume when you are done. Remove the containers first (docker rm -f container1 container2); Docker will not delete a volume that is still in use by a container.
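The same shared-volume setup can also be written as a Compose file, which makes the sharing explicit and reproducible. A minimal sketch:

```yaml
# Hypothetical Compose sketch of Task 2: both services mount the
# same named volume at /data.
version: '3.8'
services:
  container1:
    image: alpine
    command: sleep 1000
    volumes:
      - shared-volume:/data
  container2:
    image: alpine
    command: sleep 1000
    volumes:
      - shared-volume:/data
volumes:
  shared-volume:
```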
Written by
Rajat Chauhan