Week 2: DevOps Zero to Hero

Pratik Gote

Day 8: DevOps Project using Shell Scripting and GitHub API Integration

In this blog, we will build a small DevOps project using shell scripting and GitHub API integration: a script that lists the users who have access to a specific repository.

Prerequisites:

  • Basic understanding of shell scripting

  • Basic understanding of GitHub API

Project Overview:

  • We will create a shell script that will interact with the GitHub API to list users who have access to a specific repository.

  • We will use the curl command to interact with the GitHub API.

  • We will export our GitHub username and token to authenticate with the GitHub API.

  • We will use the jq command to parse the JSON response from the GitHub API.

Steps to create the project:

  1. Create a new file called list_users.sh and open it in a text editor.

  2. Add the shebang line at the top of the file: #!/bin/bash

  3. Export your GitHub username and token: export USERNAME=<your_username> and export TOKEN=<your_token>

  4. Add the following code to the file:

#!/bin/bash

# Repository owner and name are taken from the command-line arguments.
REPO_OWNER=$1
REPO_NAME=$2

# GitHub API endpoint that lists collaborators on the repository.
API_URL="https://api.github.com/repos/$REPO_OWNER/$REPO_NAME/collaborators"

# Fetch the collaborator list (authenticated with the exported TOKEN)
# and print the login name of each user.
curl -s -H "Authorization: token $TOKEN" "$API_URL" | jq -r '.[] | .login'
  5. Save the file and exit the text editor.

  6. Make the script executable: chmod +x list_users.sh

  7. Run the script with the repository owner and name as arguments: ./list_users.sh <repository_owner> <repository_name>
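
For example, a run against a repository you can administer might look like this (the owner, repository, and usernames below are purely illustrative):

$ ./list_users.sh example-org example-repo
alice
bob
charlie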

Explanation:

  • We are using the GitHub API to list users who have access to a specific repository.

  • We are using the curl command to interact with the GitHub API.

  • We are passing the repository owner and name as arguments to the script.

  • We are using the jq command to parse the JSON response from the GitHub API and print the login name of each user.

Improvements:

  • Add comments to the script to explain what each part of the script does.

  • Add a helper function to check if the script is executed with the required arguments.

  • Add error handling to the script (see the sketch below).
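
A minimal version combining these improvements might look like this (it assumes, as in step 3, that USERNAME and TOKEN are already exported in your environment):

#!/bin/bash

# list_users.sh -- list users who have access to a GitHub repository.
# Usage: ./list_users.sh <repo_owner> <repo_name>

API_URL="https://api.github.com"

# Helper function: make sure the script was called with exactly two arguments.
check_args() {
    if [ "$#" -ne 2 ]; then
        echo "Usage: $0 <repo_owner> <repo_name>" >&2
        exit 1
    fi
}

check_args "$@"

REPO_OWNER="$1"
REPO_NAME="$2"

# Call the GitHub API; -f makes curl fail on HTTP errors such as a bad token
# or a repository you cannot access.
response=$(curl -sf -u "${USERNAME}:${TOKEN}" \
    "${API_URL}/repos/${REPO_OWNER}/${REPO_NAME}/collaborators") || {
    echo "Error: could not fetch collaborators for ${REPO_OWNER}/${REPO_NAME}" >&2
    exit 1
}

# Print the login name of every collaborator.
echo "$response" | jq -r '.[].login'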

Conclusion:

In this blog, we learned how to create a DevOps project using shell scripting and GitHub API integration. We wrote a shell script that lists the users who have access to a specific repository, using curl to call the GitHub API and jq to parse the JSON response. We also exported our GitHub username and token to authenticate with the API, and discussed how to improve the script with comments, an argument-checking helper function, and error handling.

Day 9: Git and GitHub

What is Version Control System?

  • A version control system is a system that helps you manage changes to code, documents, or other digital content over time.

  • It allows multiple developers to collaborate on a project by tracking changes and maintaining a history of modifications.

Problems with Sharing Code

  • Problem 1: Sharing code between developers

  • Problem 2: Versioning (keeping track of changes)

Centralized Version Control System (CVCS)

  • Example: SVN (Subversion)

  • All developers connect to a central server to access the code.

  • If the central server goes down, no one can access the code.

Distributed Version Control System (DVCS)

  • Example: Git

  • Each developer has a local copy of the code, and changes are shared through a distributed network.

  • If one developer's system goes down, others can still access the code.

Fork

  • Creating a copy of a repository to make changes without affecting the original repository.

  • Allows developers to collaborate on a project without affecting the main codebase.

Git

  • A distributed version control system that allows developers to track changes and collaborate on projects.

  • Git is a command-line tool that can be used to create a local repository, track changes, and share code with others.

Git Functionality:

  • Version Control: Manages changes to files, allowing multiple people to work on the same project simultaneously.

  • Local Repositories: Each user has a complete copy of the repository on their local machine.

  • Branching and Merging: Facilitates branching, merging, and collaborating on different parts of a project.

  • Commands: Common commands include git init, git add, git commit, git push, git pull, and git merge.

Git Commands

  • git init: Initializes a new Git repository in the current directory.

  • git add <file>: Stages a file for the next commit.

  • git commit -m "<message>": Commits changes with a message.

  • git log: Displays a log of all commits made to the repository.

  • git status: Displays the status of the repository, including any changes that need to be committed.
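
As a quick illustration of how these commands fit together, here is a minimal first-commit session (file names and messages are placeholders):

git init                      # create an empty repository in the current directory
echo "hello" > README.md      # create a file to track
git add README.md             # stage the file
git commit -m "Add README"    # record the first commit
git status                    # working tree is now clean
git log --oneline             # shows the commit just created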

GitHub

  • A web-based platform for version control and collaboration.

  • Allows developers to create repositories, track changes, and collaborate on projects.

  • Provides features such as issues, pull requests, and project management.

GitHub Functionality:

  • Repository Hosting: Stores Git repositories in the cloud, allowing access and collaboration from anywhere.

  • Collaboration Tools: Provides features like pull requests, code reviews, issue tracking, project management boards, and wikis.

  • Social Coding: Facilitates collaboration and sharing through user profiles, followers, stars, and forks.

  • Continuous Integration: Integrates with CI/CD pipelines for automated testing and deployment.

  • Integration: Supports integrations with other tools and services such as Slack, Trello, and CI/CD pipelines.

Why GitHub?

  • GitHub is a popular platform for version control and collaboration due to its ease of use, scalability, and features.

  • It provides a centralized platform for developers to collaborate on projects and track changes.

Summary

  • Git: A tool used for tracking changes in your code, operating locally on your machine, and responsible for version control.

  • GitHub: A platform that hosts Git repositories online, offering additional features to facilitate collaboration and project management.

Day 10: Git Branching Strategy

What is Git Branching Strategy?

  • A way to manage different versions of code in a Git repository

  • Helps to ensure that new features or fixes do not affect the existing codebase

  • Allows multiple developers to work on different features or fixes simultaneously

What is a Branch?

  • A separate line of development in a Git repository

  • Created to work on new features or fixes without affecting the main codebase

  • Can be merged back into the main codebase when complete

Why do we need a branching strategy?

  • To ensure that customers get releases on time with new features and fixes.

  • To manage multiple contributors and changes to the codebase.

  • To ensure that new features and fixes don't affect the existing codebase.

Types of branches:

  • Master Branch (or Main Branch): The primary branch that holds the main codebase of the application and into which completed work is merged.

  • Feature Branches: Created for new features or changes to the existing codebase.

  • Release Branches: Created for releasing new versions of the codebase to customers.

  • Hotfix Branches: Created to quickly fix critical issues in the production environment.

How to use branches:

  1. Create a new feature branch from the master branch for new features or changes.

  2. Work on the feature branch and commit changes.

  3. Merge the feature branch back into the master branch when complete.

  4. Create a release branch from the master branch for releasing new versions.

  5. Perform testing and fixes on the release branch.

  6. Merge the release branch back into the master branch.

  7. Delete the feature branch and release branch once merged.
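
At the command level, this workflow might look roughly like the following (branch names are illustrative):

# 1. Create a feature branch from master and work on it
git checkout master
git pull
git checkout -b feature/bike
git add .
git commit -m "Add bike functionality"

# 2. Merge the finished feature back into master
git checkout master
git merge feature/bike

# 3. Cut a release branch, test and fix on it, then merge it back
git checkout -b release/v2
# (testing and fix commits happen here)
git checkout master
git merge release/v2

# 4. Delete the merged branches
git branch -d feature/bike
git branch -d release/v2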

Example: Uber

  • Uber started as a cab application

  • They created a new feature branch to add bike functionality

  • They worked on the feature branch and merged it back into the main codebase when complete

  • They created a new release branch to prepare a new release of the application

  • They tested and validated the new release before deploying to production

Best Practices

  • Always create a new branch for new features or fixes

  • Use meaningful names for branches (e.g. feature/bike, release/v2, hotfix/fix-login-issue)

  • Merge branches regularly to ensure that the main codebase is up-to-date

  • Use release branches to prepare new releases of the application

  • Use hotfix branches to quickly fix critical issues in the production environment

Day 11: Git Commands for DevOps Engineers

Initializing a Git Repository

• To initialize a new Git repository, navigate to the directory you want to track and run the following command:

$ git init

This will create a new .git directory in the current directory, which contains all the necessary metadata for Git to track changes.

Adding Files to the Repository

• To add a file to the Git repository, use the following command:

$ git add <file>

This will stage the file for the next commit. To add all changes, use:

$ git add .

Committing Changes

• To commit the staged changes, use the following command:

$ git commit -m "Commit message"

This will create a new commit with the specified message.

Cloning a Repository

• To clone an existing Git repository, use the following command:

$ git clone <repository URL>

This will create a local copy of the repository and all its history.

Branching

• Git allows you to create and switch between multiple branches. To create a new branch, use the following command:

$ git branch <branch name>

To switch to a different branch, use:

$ git checkout <branch name>

To create a branch and switch to it in one step, use git checkout -b <branch name>.

Merging Branches

• To merge one branch into another, use the following command:

$ git merge <branch name>

This will merge the changes from the specified branch into the current branch.

Rebasing Branches

• Rebasing is an alternative to merging that replays the commits of your current branch on top of another branch, producing a linear history. To rebase the current branch onto another branch, use the following command:

$ git rebase <branch name>

This replays the current branch's commits on top of the specified branch. Because rebasing rewrites commit history, avoid rebasing branches that others have already based work on.
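
For instance, assuming a feature branch that has fallen behind the main branch (names are illustrative):

git checkout feature/bike   # switch to the feature branch
git rebase main             # replay its commits on top of the latest main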

Pushing Changes to a Remote Repository

• To push your local changes to a remote repository, use the following command:

$ git push <remote name> <branch name>

Pulling Changes from a Remote Repository

• To pull changes from a remote repository, use the following command:

$ git pull <remote name> <branch name>

Cherry-Picking Commits

• Cherry-picking allows you to apply the changes from a specific commit onto the current branch. To cherry-pick a commit, use the following command:

$ git cherry-pick <commit hash>
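
Putting several of these commands together, a typical feature workflow might look like this (it assumes a remote named origin and a default branch named main):

git clone https://github.com/<owner>/<repo>.git
cd <repo>
git checkout -b feature/login-fix    # work on an isolated branch
git add .
git commit -m "Fix login issue"
git pull --rebase origin main        # replay local commits on top of the latest main
git push origin feature/login-fix    # publish the branch to the remote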

Conclusion

• These are just a few of the many Git commands that are useful for DevOps engineers. By mastering these commands, you can effectively manage your codebase and collaborate with your team.

Day 12: Deploying Your First Node.js Application on AWS EC2

Deploying Your First Node.js Application on AWS EC2: A Step-by-Step Guide

Introduction:

In this blog, we'll explore the process of deploying a Node.js application on AWS EC2. We'll cover the steps needed to set up an EC2 instance, install Node.js, and deploy our application to the cloud.

Prerequisites:

  • Basic knowledge of Node.js and AWS

  • An AWS account with access to EC2

  • A Node.js application to deploy

Want to try this project on your end? Fork the GitHub repo: https://github.com/verma-kunal/AWS-Session

Deploying a Node.js Application on AWS EC2

Testing the Project Locally First

Before deploying our Node.js application on AWS EC2, let's test it locally to ensure everything is working as expected.

  1. Clone the project from GitHub:

git clone https://github.com/verma-kunal/AWS-Session.git

  2. Set up the environment variables in a .env file:

DOMAIN=""
PORT=3000
STATIC_DIR="./client"

PUBLISHABLE_KEY=""
SECRET_KEY=""

  3. Initialise and start the project:

npm install
npm run start

Open a web browser and navigate to http://localhost:3000 to verify that the application is running locally.

Setting up an AWS EC2 Instance

Now that our application is running locally, let's set up an AWS EC2 instance to deploy it.

  1. Create an IAM user and log in to your AWS Console:

    • Access Type: Password

    • Permissions: Admin

  2. Create an EC2 instance:

    • Select an OS image: Ubuntu

    • Create a new key pair and download the .pem file

    • Instance type: t2.micro

  3. Connect to the instance using SSH:

ssh -i instance.pem ubuntu@<IP_ADDRESS>
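
If SSH refuses the key with an "UNPROTECTED PRIVATE KEY FILE" warning, restrict the .pem file's permissions first:

chmod 400 instance.pem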

Configuring Ubuntu on Remote VM

Once connected to the instance, let's configure Ubuntu on the remote VM.

  1. Update the package index so outdated packages and dependencies can be refreshed:

sudo apt update

  2. Install Git: Guide by DigitalOcean

  3. Configure Node.js and npm: Guide by DigitalOcean
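
If you would rather not follow the linked guides, one minimal option is to install both from Ubuntu's default repositories (note that these packages may be older than the current Node.js LTS release):

sudo apt install -y git nodejs npm
node -v && npm -v   # verify the installation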

Deploying the Project on AWS

Now that our EC2 instance is set up, let's deploy our project on AWS.

  1. Clone the project in the remote VM:

git clone https://github.com/verma-kunal/AWS-Session.git

  2. Set up the environment variables in a .env file:

DOMAIN=""
PORT=3000
STATIC_DIR="./client"

PUBLISHABLE_KEY=""
SECRET_KEY=""

For this project, we'll have to set up an Elastic IP address for our EC2 instance, which will be the value of DOMAIN.

  3. Initialise and start the project:

npm install
npm run start

Note: We will also have to edit the inbound rules in the EC2 instance's security group to allow inbound traffic on the application's port (3000 in this example).
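
The rule can be added from the AWS console (EC2 → Security Groups → Edit inbound rules) or from the CLI, for example (the group ID below is a placeholder, and in practice you would usually restrict the source CIDR rather than opening the port to the world):

aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp \
    --port 3000 \
    --cidr 0.0.0.0/0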

Project is Deployed on AWS 🎉

Our Node.js application is now deployed on AWS EC2!

Day 13: Top 15 AWS Services that Every DevOps Engineer Should Learn

Introduction:

As a DevOps engineer, it's essential to have a good understanding of various AWS services to improve efficiency, automation, and security in your organization. In this post, we'll cover the top 15 AWS services that every DevOps engineer should learn.

1. EC2

• EC2 (Elastic Compute Cloud) is a fundamental AWS service that provides resizable virtual machines (instances) for compute and storage. As a DevOps engineer, you should have a good understanding of EC2 instances, including their types, pricing, and security.

2. VPC

• VPC (Virtual Private Cloud) is a service that allows you to create a virtual private cloud in AWS. You should know how to create and manage VPCs, subnets, security groups, and route tables.

3. EBS

• EBS (Elastic Block Store) is a service that provides block-level storage for EC2 instances. You should understand how to create and manage EBS volumes, including their types and pricing.

4. S3

• S3 (Simple Storage Service) is a service that provides object storage for data such as static assets, backups, and logs. You should know how to create and manage S3 buckets, including their security and access controls.

5. IAM

• IAM (Identity and Access Management) is a service that provides identity and access management for AWS resources. You should understand how to create and manage IAM users, roles, and policies.

6. CloudWatch

• CloudWatch is a service that provides monitoring and logging for AWS resources. You should know how to create and manage CloudWatch metrics, alarms, and logs.

7. Lambda

• Lambda is a service that provides serverless computing for AWS resources. You should understand how to create and manage Lambda functions, including their triggers and event handling.

8. CodePipeline, CodeBuild, and CodeDeploy

• AWS's build and deployment services include CodePipeline, CodeBuild, and CodeDeploy. You should know how to create and manage CI/CD pipelines using these services.

9. AWS Config

• AWS Config is a service that provides resource inventory, configuration history, and configuration rules. You should understand how to use AWS Config to manage and audit your AWS resources.

10. Billing and Costing

• You should have a good understanding of AWS billing and costing, including how to estimate costs, track usage, and optimize costs.

11. KMS

• KMS (Key Management Service) is a service that provides encryption and key management for AWS resources. You should know how to create and manage KMS keys, including their usage and security.

12. CloudTrail

• CloudTrail is a service that provides API logging and auditing for AWS resources. You should understand how to use CloudTrail to track API calls, identify security threats, and meet compliance requirements.

13. EKS

• EKS (Elastic Kubernetes Service) is a service that provides managed Kubernetes clusters for containerized applications. You should know how to create and manage EKS clusters, including their security and networking.

14. ECS

• ECS (Elastic Container Service) is a service that provides container orchestration for containerized applications. You should understand how to create and manage ECS clusters, including their security and networking.

15. ELK Stack

• The ELK Stack (Elasticsearch, Logstash, Kibana) is not an AWS service itself but a widely used logging, monitoring, and analytics stack that is commonly run on AWS (for example, via Amazon OpenSearch Service). You should know how to set up and manage an ELK deployment, including its configuration and usage.

Conclusion

In this post, we covered the top 15 AWS services that every DevOps engineer should learn. By mastering these services, you can improve your skills and knowledge in AWS and become a more effective DevOps engineer.

