Getting Started with Docker: A Simple DevOps Project for Beginners
Greetings, DevOps enthusiasts! Ready to embark on a hands-on journey through Docker? In this blog, we'll take you through a step-by-step process of creating a Docker image for a simple web application, deploying it as a container, and even sharing your creation with the world. Let's dive right in!
The Web App: A Glimpse of What's Ahead
Imagine we have a basic web application built using Node.js. It's a simple "Hello, Docker!" app that will greet you when you access it in your web browser.
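To make the walkthrough concrete, here's a minimal sketch of what such an app.js could look like, assuming it uses only Node's built-in http module and listens on port 3000 (the port the Dockerfile below expects). Your actual app may of course look different.
// app.js - a hypothetical minimal "Hello, Docker!" server
const http = require('http');

const port = 3000; // must match the port exposed in the Dockerfile

const server = http.createServer((req, res) => {
  // Respond to every request with a plain-text greeting
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello, Docker!');
});

server.listen(port, () => {
  console.log(`Hello, Docker! app listening on port ${port}`);
});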
Step 1: Crafting the Dockerfile
Our Docker journey starts with the Dockerfile, the blueprint for your image. Here's what it looks like:
# Use the official Node.js image as the base
FROM node:14
# Create and set the working directory
WORKDIR /usr/src/app
# Copy package.json and package-lock.json to the working directory
COPY package*.json ./
# Install app dependencies
RUN npm install
# Copy the app source code to the working directory
COPY . .
# Expose the port your app runs on
EXPOSE 3000
# Command to start the app
CMD ["node", "app.js"]
Let's break down each instruction:
FROM node:14: Specifies the base image for your Docker image. In this case, it's the official Node.js 14 image, which provides the runtime environment needed to run your Node.js application.
WORKDIR /usr/src/app: Sets the working directory inside the container where subsequent commands will be executed. Here, that's /usr/src/app.
COPY package*.json ./: Copies the package.json and package-lock.json files from your local directory to the working directory in the container. These files are crucial for installing your application's dependencies.
RUN npm install: Runs npm install inside the container, installing the dependencies listed in the copied package.json file. This step ensures that your application has all the required libraries to run.
COPY . .: Copies all the files from your local directory to the working directory in the container, including your application source code.
EXPOSE 3000: Declares that the container listens on port 3000. This doesn't actually publish the port to the host machine, but it's helpful metadata that tells other containers (and anyone reading the Dockerfile) which port the app communicates on.
CMD ["node", "app.js"]: The command that runs when the container starts. In this case, it launches your Node.js application by invoking node app.js.
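Since several of these instructions revolve around package.json, here's a hypothetical, minimal package.json for an app like this; your real one will differ, especially once you add dependencies:
{
  "name": "my-web-app",
  "version": "1.0.0",
  "description": "A simple Hello, Docker! app",
  "main": "app.js",
  "scripts": {
    "start": "node app.js"
  }
}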
Step 2: Building the Image
Now, let's build the image using the following command:
docker build -t my-web-app .
docker build: Builds a Docker image from a Dockerfile. An image is a snapshot of a file system with all the necessary dependencies to run an application.
-t my-web-app: The -t option specifies a name (and optionally a tag) for the image you're building. In this case, my-web-app is the name given to the image.
. (dot): The dot at the end of the command specifies the build context, i.e. the path to the directory containing the Dockerfile and any other resources needed during the build process.
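Once the build finishes, you can confirm the image exists locally with docker images (the image ID, creation time, and size will vary on your machine):
docker images my-web-app
# Lists the locally stored my-web-app image with its tag, image ID, creation time, and size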
Step 3: Running the Container
With the image ready, let's run a container:
docker run -d -p 8080:3000 my-web-app
docker run: The basic command to start a new Docker container.
-d: Stands for "detached mode." When you run a container in detached mode, it runs in the background and doesn't tie up your terminal; you get your prompt back immediately.
-p 8080:3000: Maps a port on your host machine to a port in the container, using the format -p HOST_PORT:CONTAINER_PORT. Here it maps port 8080 on your host to port 3000 in the container, so a request to http://localhost:8080 in your browser is directed to port 3000 in the container, where your app is running.
my-web-app: The name of the Docker image to run as a container. The image must be available on your system before running this command.
Putting it all together, the command docker run -d -p 8080:3000 my-web-app starts a new Docker container in detached mode, maps port 8080 on your host machine to port 3000 in the container, and uses the my-web-app image to create the container. This is a common way to start containers for web applications so you can access them from your browser.
Step 4: Verifying the App
Open your web browser and navigate to http://localhost:8080. Voilà! Your "Hello, Docker!" app is up and running.
When you access http://localhost:8080 in a web browser, you're instructing the browser to use the HTTP protocol to communicate with a service or application running on the same device (localhost) on port 8080.
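If you prefer the terminal, you can also hit the app with curl (assuming curl is installed):
curl http://localhost:8080
# Should print the app's greeting, e.g. "Hello, Docker!"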
Step 5: Sharing Your Creation
Now, let's push your image to Docker Hub to share your masterpiece with others:
docker login
docker tag my-web-app your-dockerhub-username/my-web-app
docker push your-dockerhub-username/my-web-app
Run docker login and enter your Docker Hub credentials to authenticate yourself.
After building your Docker image locally, use docker tag to give your image a name that includes your Docker Hub username.
Finally, use docker push to push your tagged image to your Docker Hub repository, making it available for others to pull and use.
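Once the push completes, anyone (including future you on a different machine) can pull and run the image straight from Docker Hub; replace your-dockerhub-username with your actual Docker Hub account name:
docker pull your-dockerhub-username/my-web-app
docker run -d -p 8080:3000 your-dockerhub-username/my-web-app
# Same app, now running from the image hosted on Docker Hub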
Celebrating Your Docker Victory!
Congratulations, Docker virtuoso! 🥳 You've successfully Dockerized your web app, built an image, run a container, verified your app's functionality, and even pushed your creation to a repository. This journey showcases the power of Docker in simplifying application deployment and distribution.
So go ahead, wear your Docker badge proudly! You're now equipped to sail the seas of containerization with confidence. Happy coding! 🐳
Written by Ayushi Vasishtha
👩‍💻 Hey there! I'm a DevOps engineer and a tech enthusiast with a passion for sharing knowledge and experiences in the ever-evolving world of software development and infrastructure. As a tech blogger, I love exploring the latest trends and best practices in DevOps, automation, cloud technologies, and continuous integration/delivery. Join me on my blog as I delve into real-world scenarios, offer practical tips, and unravel the complexities of creating seamless software pipelines. Let's build a strong community of tech enthusiasts together and embrace the transformative power of DevOps!