Week 1

This week I covered the basic fundamentals of DevOps, including the following topics :-
Git & GitHub
Bash & Terminal mastery
Virtual machines
SSH protocol (password & keypair authentication using public/private keys)
Hosting full-stack (MERN) projects
AWS fundamentals
Reverse proxies & process management
Streamlined development & deployment workflows
Let’s look at each topic one by one.
Git & GitHub :- Git is a distributed version control system (VCS) that helps developers track changes in their code over time. It allows teams to collaborate on code without overwriting each other’s work, making development more efficient, safe, and auditable.
GitHub is a cloud-based platform that hosts Git repositories. It adds collaboration tools like issue tracking, pull requests, and CI/CD integration on top of Git. It acts as a central hub for DevOps workflows.
In a DevOps environment, Git and GitHub play a foundational role in enabling Continuous Integration (CI) and Continuous Deployment (CD).
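The day-to-day Git loop can be sketched locally; as a quick demonstration (the repository path, name, and commit details below are purely illustrative):

```shell
# Create a throwaway repo and make a first commit (illustrative paths)
rm -rf /tmp/demo-repo && mkdir -p /tmp/demo-repo && cd /tmp/demo-repo
git init -q
echo "hello devops" > README.md
git add README.md
git -c user.email=dev@example.com -c user.name="Dev" commit -q -m "Initial commit"
git checkout -q -b feature/readme   # work on a branch, then merge or open a PR
git log --oneline                   # shows the commit history
```

On GitHub, the same flow continues with `git push -u origin feature/readme` followed by opening a pull request for review.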
Bash & Terminal mastery :- Bash (Bourne Again SHell) is a command-line shell used to interact with Unix/Linux systems. It allows DevOps engineers to execute commands, automate tasks via shell scripts, and manage system operations efficiently.
Some basic terminal commands are :-

**File and Directory Navigation**

| Command | Description |
| --- | --- |
| `pwd` | Show current working directory |
| `ls` | List files and directories |
| `ls -l` | Long list with details (permissions, size, etc.) |
| `cd folder_name` | Change to a directory |
| `cd ..` | Go one directory up |
| `cd /` | Go to root directory |
| `cd ~` | Go to home directory |
| `mkdir folder` | Create a new directory |
| `rmdir folder` | Remove empty directory |

**File Operations**

| Command | Description |
| --- | --- |
| `touch filename.txt` | Create a new empty file |
| `cp source dest` | Copy file or folder |
| `mv oldname newname` | Move or rename file/folder |
| `rm filename.txt` | Delete file |
| `rm -r folder` | Delete folder and its contents |
| `cat file.txt` | View file contents |
| `nano file.txt` or `vi file.txt` | Edit file in terminal editor |

**User & Superuser**

| Command | Description |
| --- | --- |
| `sudo` | Run command with superuser rights |
| `su` | Switch user |
| `adduser user` | Add new user |
| `passwd` | Change password |

**Networking**

| Command | Description |
| --- | --- |
| `ping google.com` | Check internet connection |
| `ifconfig` or `ip a` | Show IP address and interfaces |
| `netstat -tulnp` | Show open ports and services |
| `curl url` | Make web requests |

**Package Management (Debian/Ubuntu)**

| Command | Description |
| --- | --- |
| `sudo apt update` | Update package list |
| `sudo apt upgrade` | Upgrade installed packages |
| `sudo apt install packagename` | Install new software |
| `sudo apt remove packagename` | Remove installed software |
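Beyond one-off commands, Bash shines at automating tasks in scripts. A minimal sketch (all paths here are illustrative) that backs up a directory into a timestamped tarball:

```shell
#!/usr/bin/env bash
# Back up a directory into a timestamped tarball (paths are illustrative).
set -euo pipefail

mkdir -p /tmp/demo_src /tmp/backups
echo "hello" > /tmp/demo_src/notes.txt

stamp=$(date +%Y%m%d_%H%M%S)
tar -czf "/tmp/backups/demo_src_$stamp.tar.gz" -C /tmp demo_src
echo "Backup written to /tmp/backups/demo_src_$stamp.tar.gz"
```

Dropped into cron, a script like this becomes a scheduled job — the kind of small automation DevOps work is built on.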
Virtual machines :- A Virtual Machine (VM) is a software-based simulation of a physical computer. It runs an operating system and applications just like a real machine but is hosted on a physical computer (called the host) using a hypervisor.
SSH protocol :- SSH (Secure Shell) is a network protocol used to securely access and control remote machines over an unsecured network.
Commands used :-

```bash
ssh username@host                    # password authentication
ssh -i ~/.ssh/id_rsa username@host   # key-pair authentication
```
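Key-pair authentication starts with generating a key pair on your local machine. As a sketch (the key path and empty passphrase here are just for demonstration):

```shell
# Generate an Ed25519 key pair non-interactively (demo path, empty passphrase)
rm -f /tmp/demo_key /tmp/demo_key.pub
ssh-keygen -t ed25519 -f /tmp/demo_key -N "" -q
# The public key (.pub) goes on the server in ~/.ssh/authorized_keys,
# e.g. via: ssh-copy-id -i /tmp/demo_key.pub username@host
cat /tmp/demo_key.pub
```

The private key never leaves your machine; the server only ever sees the public half, which is what makes key-pair authentication safer than passwords.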
Hosting full-stack (MERN) projects :- MERN hosting is a key DevOps skill that combines deployment, configuration, and secure integration of a full-stack application.
AWS fundamentals :- One of the most fundamental skills in DevOps is deploying backend applications on cloud infrastructure. In this guide, we'll walk through how to launch a virtual server (EC2 instance) on Amazon Web Services (AWS) and deploy a Node.js backend application securely.
✅ 1. Launching an EC2 Instance
Start by heading to the AWS EC2 dashboard and clicking “Launch Instance”. Here, you’ll configure your virtual server:

- Name your instance for clarity (e.g., `node-backend-server`).
- Choose an Amazon Machine Image (AMI) — we’ll use Ubuntu 20.04 LTS.
- Select an instance type like t2.micro, which is free-tier eligible.
- Create or use a key pair to securely connect to the server via SSH.
- Configure a security group to allow:
  - Port 22 (SSH) from your IP address
  - Port 3000 or 5000 (your Node.js app port) from anywhere
  - Port 80 if your app uses HTTP
✅ 2. Connecting via SSH
Use your terminal to connect to the server:
```bash
chmod 400 your-key.pem
ssh -i your-key.pem ubuntu@<your-ec2-public-ip>
```
This gives you shell access to the remote server.
✅ 3. Setting Up the Server Environment
Once inside the EC2 server:
- Update packages and install Node.js and npm:

```bash
sudo apt update && sudo apt install nodejs npm -y
```

- (Optional) Install Git if you're pulling from a GitHub repository:

```bash
sudo apt install git -y
```

- Clone your backend project:

```bash
git clone https://github.com/your-username/your-backend-repo.git
cd your-backend-repo
npm install
```
✅ 4. Configure Environment Variables
Set environment variables like your database URI or JWT secret keys. You can either:

- Use a `.env` file with the `dotenv` package, or
- Export them directly in the terminal:

```bash
export MONGO_URI=your_mongo_url
export JWT_SECRET=your_jwt_secret
```
✅ 5. Run the Node.js Backend
Now start your backend server:
```bash
node index.js
```

Or, for a production-grade setup, install a process manager like `pm2`:

```bash
npm install -g pm2
pm2 start index.js
```
This ensures your server stays running even if the terminal closes or crashes.
✅ 6. Accessing the Server
Open a browser and go to:
```
http://<your-ec2-public-ip>:3000
```

You should see your backend API or homepage response.
Reverse proxies & process management :- A reverse proxy is a server that sits in front of your application server. Instead of users directly accessing your Node.js backend, they interact with the reverse proxy, which forwards the request to your app behind the scenes.
How Reverse Proxies Work
1. Clients send requests to `example.com`
2. A reverse proxy server (e.g., Nginx, Apache) listens on port 80/443
3. The proxy forwards the request to your backend running on, say, port 3000
4. The backend processes the request and returns a response to the proxy
5. The proxy then sends the final response back to the client
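The flow above maps directly onto an Nginx server block. A minimal sketch (the domain name and backend port are illustrative, not from a real deployment):

```nginx
server {
    listen 80;
    server_name example.com;

    location / {
        # Forward every request to the Node.js backend on port 3000
        proxy_pass http://127.0.0.1:3000;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        # Needed for WebSocket upgrades, if the app uses them
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

With this in place, users hit port 80 on the proxy and never talk to port 3000 directly; adding HTTPS later only changes the proxy, not the app.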
In Summary :-
A reverse proxy (like Nginx) manages incoming traffic and forwards it to your backend app — handling ports, HTTPS, and security.
A process manager (like PM2) ensures your app keeps running in the background and recovers from crashes.
Streamlined development & deployment workflows :- Streamlined development focuses on efficient, organized workflows to speed up software delivery and improve quality. Key elements include clear planning, using version control with branching strategies, and automated testing to catch bugs early. Continuous Integration and Deployment (CI/CD) pipelines automate builds and deployments, ensuring consistency. Code reviews and modular, reusable code enhance maintainability. Consistent development environments, enabled by tools like Docker, prevent setup issues. Documentation and communication keep teams aligned and informed. Agile methodologies support iterative progress and quick adaptation. Overall, streamlined workflows reduce errors, save time, and promote collaboration for faster, reliable software development.
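As a sketch of what a CI pipeline looks like in practice, here is a minimal GitHub Actions workflow that runs a Node.js project's tests on every push (the file path and step names are illustrative assumptions, not from a real repository):

```yaml
# .github/workflows/ci.yml (hypothetical)
name: CI
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci     # clean, reproducible install from package-lock.json
      - run: npm test   # fail the build if any test fails
```

Every push then gets the same checkout-install-test treatment, which is exactly the consistency the paragraph above describes.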
Thanks for reading… Stay connected!
Written by Lav Kushwaha
