Day-3 | Linux Advanced | Part-2

Dhruv Rajvanshi

Hello, tech enthusiasts! 🌟

Welcome back to Day 3 of my DevOps learning journey! Today, we're diving deeper into the world of Linux, focusing on log file analysis and powerful text-processing commands. These tools are essential for filtering, searching, and manipulating data, making system administration more efficient and effective. Let’s get started!

1. grep - Search for Patterns

The grep command is a powerful tool for searching text files for lines that match a given pattern. It’s widely used for filtering and extracting specific information from large files.

  • Usage Examples:

    • grep 'pattern' filename → Search for lines containing 'pattern' in filename.

    • grep 'ERROR' logfile.txt → Find all lines with the word "ERROR" in logfile.txt.

    • grep -r 'pattern' /path/to/directory → Recursively search for 'pattern' in all files under the specified directory.

grep helps quickly locate and analyze specific data points within files.
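
A few more flags worth knowing (the file and directory names here are just placeholders):

    # Case-insensitive search for "error" in a log file
    grep -i 'error' access.log

    # Show matching lines together with their line numbers
    grep -n 'ERROR' logfile.txt

    # Recursively search a directory and list only the names of files that match
    grep -rl 'pattern' /path/to/directory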

2. find - Locate Files and Directories

The find command allows you to search for files and directories based on various criteria. It’s incredibly useful for locating files without knowing their exact paths.

  • Usage Examples:

    • find /path/to/directory -name "filename" → Find files named "filename" anywhere under the specified directory (the search is recursive by default).

    • find /var/log -type d -name "log" → Locate directories named "log" within /var/log.

    • find /home/user -mtime -7 → Search for files in /home/user modified in the last 7 days.

find provides flexibility in locating files based on numerous attributes.
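
A few more ways to combine find's tests (paths are placeholders; always preview matches before using -delete):

    # Regular files larger than 100 MB under /var/log
    find /var/log -type f -size +100M

    # *.tmp files older than 7 days under /home/user
    find /home/user -name "*.tmp" -mtime +7

    # Same search, but delete the matches (run the preview above first!)
    find /home/user -name "*.tmp" -mtime +7 -delete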

3. awk - Pattern Scanning and Processing

awk is a powerful programming language designed for pattern scanning and processing. It’s ideal for working with structured data like CSV files.

  • Usage Examples:

    • awk '{print $1}' filename → Print the first column from each line in filename.

    • awk '/pattern/ {action}' filename → Perform 'action' on lines matching 'pattern'.

    • awk '{sum += $2} END {print sum}' file.txt → Sum up values in the second column of file.txt and print the result.

With awk, you can extract, manipulate, and summarize data efficiently.
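
A small sketch of how these pieces fit together (data.csv and file.txt are placeholder files):

    # Fields are whitespace-separated by default; print the first one
    awk '{print $1}' filename

    # For a CSV, set the field separator with -F and print the second column
    awk -F',' '{print $2}' data.csv

    # Sum the second column and print the total once all lines are read
    awk '{sum += $2} END {print "total:", sum}' file.txt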

4. sed - Stream Editor for Text Transformation

sed is a stream editor used for filtering and transforming text. It’s particularly useful for batch editing and applying text substitutions.

  • Usage Examples:

    • sed 's/old/new/' filename → Replace the first occurrence of "old" with "new" on each line of filename.

    • sed -i 's/old/new/g' filename → Replace all occurrences of "old" with "new" in filename in-place.

    • sed -n '5,10p' filename → Print lines 5 through 10 of filename.

sed is invaluable for performing automated text edits and transformations.
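
A couple of extra patterns that come up often (GNU sed syntax; filename is a placeholder):

    # Preview a substitution without touching the file (output goes to stdout)
    sed 's/old/new/g' filename

    # Edit in place, but keep a backup copy with a .bak extension
    sed -i.bak 's/old/new/g' filename

    # Delete all blank lines
    sed '/^$/d' filename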

Difference between grep, find, awk, and sed

  • grep (search for patterns): searches for lines matching a pattern in files. Use case: finding specific text within log files.

  • find (locate files and directories): searches for files and directories based on criteria like name, type, and modification time. Use case: locating a file in a directory or finding recently modified files.

  • awk (pattern scanning and processing): processes and analyzes text, often used for extracting and summarizing data. Use case: extracting columns from structured data or calculating sums.

  • sed (stream editor): performs text transformations such as substitutions and deletions. Use case: editing and replacing text in files or streams.

Understanding the mount Command

The mount command in Linux attaches a filesystem or storage device to a directory in the filesystem tree, known as the mount point. Once a device is mounted, its contents become accessible to the operating system through that directory, so you can read from and write to it like any other part of the filesystem.
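
A minimal sketch of the idea (the device and directory names are just examples):

    # Attach the filesystem on /dev/xvdf to the directory /mnt/data
    sudo mount /dev/xvdf /mnt/data

    # Confirm it is mounted
    findmnt /mnt/data

    # Detach it again when finished
    sudo umount /mnt/data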

Steps to Mount an EBS Volume in Ubuntu Linux

Step 1: Attach the EBS Volume to Your Instance

  1. Log in to AWS Management Console:

    • Open your web browser and go to the AWS Management Console.
  2. Navigate to EC2 Dashboard:

    • Click on the "EC2" service to access the EC2 Dashboard.
  3. Select Volumes:

    • In the left-hand menu, choose “Volumes” under the “Elastic Block Store” section.
  4. Attach the Volume:

    • Select the EBS volume you wish to attach.

    • Click on the "Actions" dropdown menu and select "Attach Volume".

    • Choose the instance you want to attach the volume to from the list.

    • Specify the device name (e.g., /dev/xvdf).

    • Click the "Attach" button to complete the process.

Step 2: Connect to Your Linux Instance

  1. SSH into Your Instance:

    • Open an SSH client (like PuTTY on Windows or Terminal on macOS/Linux).

    • Connect to your instance using its public DNS and your SSH key file.
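
For example (the key path and hostname are placeholders; on Ubuntu AMIs the default user is usually ubuntu):

    ssh -i ~/keys/my-key.pem ubuntu@<your-instance-public-dns>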

Step 3: Identify the EBS Volume

  1. List Available Disks:

    • Use the lsblk command to view the list of available disks and identify the newly attached EBS volume.
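
For example:

    lsblk
    # The new volume shows up with no MOUNTPOINT, typically as xvdf here
    # (on newer Nitro-based instance types it may appear as nvme1n1 instead)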

Step 4: Create a Filesystem on the Volume

  1. Format the EBS Volume:

    • If the volume does not already contain a filesystem, create one with the mkfs command (e.g., sudo mkfs -t ext4 /dev/xvdf), as shown below.
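
A quick way to check before formatting (formatting erases any existing data on the volume):

    # "data" in the output means the device has no filesystem yet
    sudo file -s /dev/xvdf

    # Create an ext4 filesystem on the empty volume
    sudo mkfs -t ext4 /dev/xvdf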

Step 5: Create a Mount Point

  1. Create a Directory:

    • Make a directory where you will mount the EBS volume (e.g., sudo mkdir /mnt/my-ebs-volume).
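
For example:

    # -p avoids an error if the directory already exists
    sudo mkdir -p /mnt/my-ebs-volume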

Step 6: Mount the EBS Volume

  1. Mount the Volume:

    • Attach the volume to the directory using the mount command (e.g., sudo mount /dev/xvdf /mnt/my-ebs-volume).
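
For example:

    sudo mount /dev/xvdf /mnt/my-ebs-volume
    # Note: a mount made this way does not survive a reboot; to make it permanent,
    # add an entry to /etc/fstab using the volume's UUID (shown by sudo blkid)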

Step 7: Verify the Mount

  1. Check Mounted Filesystems:

    • Use the df -h command to ensure the volume is mounted correctly and visible at the mount point.
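
For example:

    df -h /mnt/my-ebs-volume
    # The volume should be listed with its size, used/available space, and mount point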

Conclusion

Today's focus on log file analysis, advanced text-processing commands, and EBS volume management has provided us with essential skills for effective system administration and data manipulation. Mastering these tools will greatly enhance your ability to manage and troubleshoot Linux systems.

Stay tuned for more insights and discoveries as I continue my DevOps learning journey. Let’s keep advancing our skills together!
