Week 1: DevOps Zero to Hero

Pratik Gote

Day 1: Getting Started with DevOps

What is DevOps?

  • DevOps is a culture that improves the delivery process by following certain principles.

  • It's a practice that adopts automation, quality, monitoring, and testing to improve the delivery process.

  • DevOps is not just about delivery, but also about ensuring quality, automation, monitoring, and testing.

Definition of DevOps

  • DevOps is the process of improving application delivery by ensuring that proper automation, quality checks, continuous monitoring, and continuous testing are in place.

Why DevOps?

  • DevOps evolved to improve the process of delivery, which was slow and manual before.

  • Multiple teams were involved in the process, including system administrators, build and release engineers, and server administrators.

  • The process was slow because of manual effort and lack of automation.

How to Introduce Yourself as a DevOps Engineer

  • Introduce yourself as a DevOps engineer with your experience in DevOps (e.g., 4-5 years).

  • Mention your previous experience, if any (e.g., system administrator, build and release engineer, server administrator).

  • Explain your roles and responsibilities in your current organization, including automation, quality, monitoring, and testing.

Key Takeaways

  • DevOps is a culture that improves the delivery process.

  • DevOps is not just about delivery, but also about ensuring quality, automation, monitoring, and testing.

  • Introduce yourself as a DevOps engineer with your experience and previous background.


Day 2: Software Development Life Cycle (SDLC)

Introduction

  • Recap of Day 1: Introduction to DevOps

  • Importance of understanding SDLC for DevOps engineers

What is SDLC?

  • SDLC stands for Software Development Life Cycle

  • It's a process used by the software industry to design, develop, test, and deliver high-quality products

  • SDLC is a standard followed by every organization, whether it's a startup, MNC, or unicorn

Phases of SDLC

  1. Planning

    • Gathering requirements from customers and stakeholders

    • Defining project scope, goals, and timelines

    • Identifying resources and budget

  2. Designing

    • High-level design (HLD) and low-level design (LLD)

    • HLD: overall system architecture, scalability, and availability

    • LLD: detailed design of individual components and modules

  3. Building

    • Developing the application code

    • Writing automated tests and integrating with CI/CD pipelines

  4. Testing

    • Quality assurance (QA) engineers test the application

    • Testing for functionality, performance, security, and usability

  5. Deployment

    • Promoting the application to production

    • Ensuring smooth deployment and rollback processes

Importance of SDLC

  • Ensures high-quality products are delivered to customers

  • Improves collaboration and communication among teams

  • Reduces errors and bugs

  • Increases efficiency and productivity

DevOps Engineer's Role in SDLC

  • Primarily focused on automating and improving the efficiency of building, testing, and deployment phases

  • Ensures that the process is followed without manual intervention

  • Improves the overall efficiency of product delivery

Agile Methodology

  • An iterative approach to project management

  • Breaks down the project into smaller chunks and focuses on continuous improvement

  • Commonly used in software development projects

Key Takeaways

  • SDLC is a standard process followed by every organization

  • DevOps engineers play a crucial role in automating and improving the efficiency of building, testing, and deployment phases

  • Agile methodology is commonly used in software development projects


Day 3: Virtual Machines

Real-World Scenario: Efficient Use of Resources

  • Imagine a plot of land with a house built on it, where a family of 4-5 people is living.

  • The family realizes that they only need half an acre of land, but they are wasting the other half.

  • To make efficient use of resources, they decide to build another property on the unused land and rent it out.

  • This way, they are using the resources efficiently, and two families can live on the same land without interfering with each other.

Server and Virtualization

  • A server is a computer that provides services over a network.

  • In the software industry, servers are used to deploy applications.

  • Example: A company, example.com, buys five physical servers from HP or IBM.

  • Each server has its own resources (CPU, RAM, etc.).

  • The company deploys an application on each server, but realizes that each application only uses a fraction of the server's resources.

  • This leads to inefficiency and waste of resources.

Virtualization

  • Virtualization is a concept that allows multiple virtual machines (VMs) to run on a single physical server.

  • A hypervisor is a software that creates and manages VMs on a physical server.

  • Popular hypervisors include VMware ESXi, Xen, and Hyper-V.

  • Virtualization allows for logical partitioning of resources, making it possible to create multiple VMs on a single physical server.

  • Each VM gets its own share of CPU, RAM, and other hardware resources, and the VMs are logically isolated from each other.

Advantages of Virtualization

  • Increased efficiency: Multiple VMs can run on a single physical server, making efficient use of resources.

  • Logical isolation: Each VM is isolated from the others, ensuring that they don't interfere with each other.

  • Scalability: VMs can be easily created, cloned, or deleted as needed.

Cloud Providers and Virtualization

  • Cloud providers like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform use virtualization to provide virtual machines to customers.

  • They build massive data centers with thousands of physical servers.

  • When a customer requests a virtual machine, the cloud provider's hypervisor creates a VM on one of the physical servers.

  • The customer is given access to the VM, but they don't have physical access to the server.

How Virtual Machines Work

  • A customer requests a virtual machine with specific resources (CPU, RAM, etc.).

  • The cloud provider's hypervisor finds an available physical server that meets the customer's requirements.

  • The hypervisor creates a VM on the physical server and provides the customer with access to the VM.

  • The customer can use the VM as if it were a physical server, but they don't have physical access to the server.


Day 4: AWS & Azure - How to Create Virtual Machines

Recap of Previous Day

  • In DevOps, to create a virtual machine, you make a request to a Cloud provider (e.g., AWS, Azure).

  • The Cloud provider responds with an IP address and specifications of the virtual machine.

Creating Virtual Machines

  • As a DevOps engineer, you should focus on efficiency and automation.

  • Manual creation of virtual machines is not efficient, especially when dealing with multiple requests.

  • Automation is key to improving efficiency and reducing errors.

AWS EC2 API

  • AWS provides an API (Application Programming Interface) for creating virtual machines.

  • The API receives a request, validates it, and responds with a virtual machine instance.

  • As a DevOps engineer, you can write a script to automate the creation of virtual machines using the AWS EC2 API.

Automation Options

  • AWS CLI (Command Line Interface): allows you to automate the creation of virtual machines using command-line commands.

  • AWS API: allows you to directly interact with the AWS API using programming languages like Python.

  • AWS CFT (CloudFormation Templates): a templating language that allows you to define infrastructure as code.

  • Terraform: an open-source tool that allows you to automate infrastructure creation across multiple Cloud providers.
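
As a quick illustration, here is a minimal AWS CLI sketch for launching an instance from a script; the AMI ID and key pair name are placeholders you would replace with your own values:

# Launch a single t2.micro instance (placeholder AMI ID and key pair name)
aws ec2 run-instances \
    --image-id ami-0abcdef1234567890 \
    --instance-type t2.micro \
    --count 1 \
    --key-name my-key-pair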

Hybrid Cloud Model

  • Some organizations use a hybrid Cloud model, where they have virtual machines in one Cloud platform and other resources in another Cloud platform.

  • In this case, Terraform is a good choice for automation because it supports multiple Cloud providers.

Practical Demonstration

  • Creating a Virtual Machine on AWS

    • To create a virtual machine on AWS, you need to:

      1. Log into the AWS Console.

      2. Search for the EC2 service.

      3. Click on "Launch Instance".

      4. Provide the instance details (e.g., instance name, operating system).

      5. Choose a free tier eligible option to avoid charges.

      6. Create a key pair (e.g., RSA type, .pem format) to log into the instance.

      7. Launch the instance.

  • Creating a Virtual Machine on Microsoft Azure

    • To create a virtual machine on Azure, you need to:

      1. Log into the Azure portal (portal.azure.com).

      2. Click on "Create a resource" or "Virtual machines".

      3. Provide the instance details (e.g., instance name, operating system).

      4. Choose a free tier eligible option to avoid charges.

      5. Launch the instance.
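
For comparison, Azure VMs can also be created from the command line. A minimal Azure CLI sketch (the resource group, VM name, size, and image alias are placeholders, and image aliases vary by CLI version):

# Create a small Ubuntu VM in an existing resource group (placeholder names)
az vm create \
    --resource-group my-resource-group \
    --name my-test-vm \
    --image Ubuntu2204 \
    --size Standard_B1s \
    --admin-username azureuser \
    --generate-ssh-keys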

Key Takeaways

  • Automation is essential for efficient virtual machine creation.

  • AWS provides multiple options for automation, including AWS CLI, AWS API, and AWS CFT.

  • Terraform is a good choice for hybrid Cloud models.


How to Access EC2 Instances from a Windows Machine

Step-by-Step Notes:

  1. Create an EC2 Instance:

    • Go to the AWS Management Console and click on "Launch Instance"

    • Provide a name for the instance (e.g., "Test Windows")

    • Select the Ubuntu operating system and the t2.micro instance type

    • Create a key pair and download the .pem file

  2. Download MobaXterm:

    • Search for "MobaXterm" and download the free Home Edition

    • Choose the installer option instead of the portable option

  3. Install MobaXterm:

    • Run the installer and follow the prompts to install MobaXterm

  4. Extract the MobaXterm Folder:

    • Go to the Downloads folder and extract the downloaded MobaXterm archive

  5. Open MobaXterm:

    • Search for MobaXterm in the Start menu and open it

  6. Create a New Session:

    • Click on "Sessions" and then "New Session"

    • Select "SSH" as the protocol

    • Enter the hostname (public IP address of the EC2 instance)

    • Enter the username (default is "ubuntu")

  7. Configure Advanced Shell Settings:

    • Click on "Advanced Shell Settings"

    • Select "Use private key" and browse to the location of the .pem file

  8. Connect to the EC2 Instance:

    • Click "OK" to connect to the EC2 instance

    • On the first connection you will be prompted to trust the server's host key; click "Accept"

    • You should now be connected to the EC2 instance and can run commands to verify

Additional Tips:

  • Make sure to enable SSH in the security group settings for the EC2 instance

  • Use the .pem file instead of the .ppk file if you're using MobaXterm

  • MobaXterm is a better alternative to PuTTY for accessing EC2 instances from Windows machines


Day 5: Efficient Way of Creating Virtual Machines

  • Logging into Virtual Machine

    • Two ways to log into a virtual machine:

      1. Through AWS Console

      2. Through Terminal (CLI)

Logging into Virtual Machine through AWS Console

  • Go to EC2 dashboard

  • Select the instance you want to log into

  • Click on "Connect" and then "Connect" again

  • Establish a connection with the IP address

Logging into Virtual Machine through Terminal (CLI)

  • Need to have a terminal/SSH client installed (e.g., iTerm, PuTTY, MobaXterm)

  • Use SSH command to connect to the virtual machine

  • Need to provide the public IP address and the key pair (.pem file)

  • Use the chmod command to restrict the permissions of the .pem file (full example below)

Example Command:

ssh -i "demo1.pem" ubuntu@public_ip_address
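
Putting the steps together, a typical session looks like this (demo1.pem and public_ip_address stand in for your own key file and the instance's public IP):

# Restrict the key file's permissions so SSH will accept it
chmod 400 demo1.pem

# Connect as the default "ubuntu" user of an Ubuntu AMI
ssh -i demo1.pem ubuntu@public_ip_address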

Deleting an Instance

  • Go to EC2 dashboard

  • Select the instance you want to delete

  • Click on "Actions" and then "Instance State"

  • Choose "Stop instance" to shut it down temporarily, or "Terminate instance" to delete it permanently
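
The same can be done from the AWS CLI if you prefer the command line (the instance ID below is a placeholder):

# Terminate an instance by its ID (this permanently deletes the instance)
aws ec2 terminate-instances --instance-ids i-0123456789abcdef0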

AWS CLI

  • Command line interface to interact with AWS API

  • Need to download and install AWS CLI

  • Need to configure AWS CLI with access key ID and secret access key

Example Command:

aws configure
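
Running aws configure prompts for your credentials and defaults; afterwards you can verify that the CLI can reach your account. A sketch of the flow (the values shown are placeholders):

aws configure
# AWS Access Key ID [None]: <your access key ID>
# AWS Secret Access Key [None]: <your secret access key>
# Default region name [None]: us-east-1
# Default output format [None]: json

# Verify that the credentials work
aws sts get-caller-identity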

Creating an S3 Bucket using AWS CLI

  • Use aws s3 mb command to create an S3 bucket

  • Need to provide the bucket name and region

Example Command:

aws s3 mb s3://pratikgote-bucket --region us-east-1

CloudFormation Templates

  • Another way to automate the creation of resources on AWS

  • Use a template to define the resources you want to create

  • Can be used to create complex infrastructure

Example:

AWSTemplateFormatVersion: '2010-09-09'
Resources:
  MyEC2Instance:
    Type: 'AWS::EC2::Instance'
    Properties:
      ImageId: 'ami-abc123'
      InstanceType: 't2.micro'
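
To create a stack from a template like this, you would save it to a file and pass it to the AWS CLI; the stack name and file name below are placeholders:

# Create a stack from the template file
aws cloudformation create-stack \
    --stack-name my-ec2-stack \
    --template-body file://ec2-instance.yaml

# Check the stack's status
aws cloudformation describe-stacks --stack-name my-ec2-stack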

Boto3

  • Python module to interact with AWS API

  • Need to install boto3 using pip

  • Can be used to automate the creation of resources on AWS

Example Code:

import boto3

# Create an EC2 client using the credentials set up with "aws configure"
ec2 = boto3.client('ec2')

# List the EC2 instances in the default region and print the raw response
response = ec2.describe_instances()
print(response)

Assignment:

  • Install AWS CLI and configure it with access key ID and secret access key

  • Create an S3 bucket using AWS CLI

  • Create an EC2 instance using AWS CLI


Day 6: Linux Operating System and Basics of Shell Scripting

What is an Operating System?

  • An operating system acts as a bridge between software and hardware

  • It drives communication between software and hardware

  • Examples: Windows, Linux, macOS

Why Linux is Popular

  • Free and open-source

  • Secure

  • Fast

  • Widely used in production systems

Architecture of Linux Operating System

  • Kernel: heart of the Linux operating system, responsible for device management, memory management, process management, and handling system calls

  • System Libraries: provide standard functions that applications use to request services from the kernel, e.g., libc

  • Compilers: compile code, e.g., GCC

  • User Processes: the applications and programs run by users

  • System Software: system-level utilities and background services (daemons)

Basics of Shell Scripting

  • Shell: a command-line interface used to talk to the operating system

  • Shell commands: used to navigate the file system, create files and directories, and perform everyday tasks

  • Popular shell commands (an example session follows this list):

    • pwd: present working directory

    • ls: list files and directories

    • cd: change directory

    • mkdir: create a directory

    • rm: remove a file or directory

    • touch: create a file

    • vi: create and edit a file

    • cat: print the contents of a file
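
A short example session using these commands (the directory and file names are arbitrary):

pwd                        # show the current directory
mkdir demo-dir             # create a directory
cd demo-dir                # move into it
touch notes.txt            # create an empty file
echo "hello" > notes.txt   # write a line into the file
cat notes.txt              # print its contents
ls                         # list files in the directory
cd ..                      # go back up one level
rm -r demo-dir             # remove the directory and its contents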

Advanced Shell Commands

  • ls -ltr: long listing of files and directories (permissions, owner, size, timestamp), sorted by modification time with the most recently changed files last

  • free: display memory usage

  • nproc: display the number of CPUs

  • df -h: display disk usage

  • top: display system performance and resource usage
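
Sample invocations (the output depends on the machine you run them on):

ls -ltr                  # long listing sorted by modification time
free -m                  # memory usage in megabytes
nproc                    # number of CPU cores
df -h                    # disk usage in human-readable units
top -b -n 1 | head -15   # one non-interactive snapshot of system activity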

Real-World Use Cases

  • Creating a file and writing to it

  • Creating a directory and navigating to it

  • Removing a file or directory

  • Checking system performance and resource usage


Day 7: Real-time Shell Script Project for DevOps Engineers

Introduction

  • The project is about creating a real-time shell script that tracks AWS resource usage.

  • This script is useful for DevOps engineers to monitor and report on AWS resource usage.

Why Move to Cloud Infrastructure?

  • Two primary reasons to move to cloud infrastructure:

    1. Manageability: Reduces maintenance overhead, as cloud providers manage servers and infrastructure.

    2. Cost-effectiveness: Pay-as-you-go model, only pay for resources used.

Project Overview

  • The script will track resource usage for EC2, S3, Lambda, and IAM users.

  • The script will generate a report every day at a specified time.

  • The report will be sent to a manager or stored in a reporting dashboard.

Prerequisites

  • AWS CLI installed and configured.

  • Basic knowledge of shell scripting and AWS CLI commands.

Scripting

  • The script will use Bash shell.

  • The script will have comments to explain what each section does.

  • The script will use AWS CLI commands to retrieve resource usage information.

Tracking Resource Usage

  • S3: Use aws s3 ls command to list S3 buckets.

  • EC2: Use aws ec2 describe-instances command to list EC2 instances.

  • Lambda: Use aws lambda list-functions command to list Lambda functions.

  • IAM: Use aws iam list-users command to list IAM users.

Improving the Script

  • Add print statements to improve user experience.

  • Use set -x command to enable debug mode.

  • Use jq command to parse JSON output and extract required information.
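
For example, debug mode is a single line near the top of the script, and jq can reduce the verbose describe-instances JSON to just the instance IDs (the same filter is used in the script below; jq must be installed on the machine):

set -x   # print every command before it runs (debug mode)
aws ec2 describe-instances | jq -r '.Reservations[].Instances[].InstanceId'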

Integrating with Cron Job

  • The script will be integrated with a cron job so that it runs every day at a specified time.

  • The output will be redirected to a file called resource_tracker (a sample crontab entry is shown after the script below).

Here is the script:

    #!/bin/bash

    # Author: Pratik Gote
    # Date: 11th Jan
    # Version: 1

    # Track S3 buckets
    echo "List of S3 buckets:"
    aws s3 ls

    # Track EC2 instances
    echo "List of EC2 instances:"
    aws ec2 describe-instances | jq '.Reservations[].Instances[].InstanceId'

    # Track Lambda functions
    echo "List of Lambda functions:"
    aws lambda list-functions

    # Track IAM users
    echo "List of IAM users:"
    aws iam list-users

You can redirect the output to a file by appending > resource_tracker when you run the script (or to the individual commands inside it).
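
To schedule the script, you could add a crontab entry like the one below; the script path and the 6 PM run time are just examples, not part of the original setup:

# Run the tracker every day at 18:00 and append its output to the resource_tracker file
0 18 * * * /home/ubuntu/aws_resource_tracker.sh >> /home/ubuntu/resource_tracker 2>&1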


