How I Automated Log Analysis Using Grep, Awk, and Sed

AWSomeVikash

These three powerful commands—grep, awk, and sed—are essential tools in Linux for searching, extracting, and manipulating text data. Each serves a unique purpose, but they are often used together for text processing in files and terminal output.

grep: Global Regular Expression Print

What grep does

  • Searches for a specific pattern in files or command output and prints the matching lines.

  • Extremely fast for searching large files or folders for keywords, names, or patterns.

Common Use Cases

  • Finding configuration values in system files.

  • Searching logs for errors or specific messages.

  • Filtering output from other commands.
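A quick sketch of that last use case: grep reads from a pipe just like it reads from a file. The printf line below is only a stand-in for a real command such as ps or ls that prints several lines.

```shell
# grep can filter the output of any command through a pipe.
# printf here stands in for a real command producing multi-line output.
printf 'nginx running\nsshd running\ncron stopped\n' | grep "running"
# prints the two lines that contain "running"
```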

Example: Suppose we have a log file named app.log and we want to search for the pattern "check pass". We type:

grep "check pass" app.log

It prints every line that contains "check pass", like this:

Jun 18 01:30:59 combo sshd(pam_unix)[31207]: check pass; user unknown

Jun 20 09:20:05 combo sshd(pam_unix)[10035]: check pass; user unknown

Jun 20 09:20:05 combo sshd(pam_unix)[10037]: check pass; user unknown

Jun 20 09:20:05 combo sshd(pam_unix)[10039]: check pass; user unknown

Jun 20 09:20:06 combo sshd(pam_unix)[10041]: check pass; user unknown

Jun 20 09:20:07 combo sshd(pam_unix)[10043]: check pass; user unknown

Jun 20 09:20:07 combo sshd(pam_unix)[10045]: check pass; user unknown

Jun 20 09:20:07 combo sshd(pam_unix)[10047]: check pass; user unknown

Jun 20 09:20:07 combo sshd(pam_unix)[10049]: check pass; user unknown

If we want only the first three lines of that output, we pipe it to head:

grep "check pass" app.log | head -n 3

Then it prints only the first three matching lines:

Jun 18 01:30:59 combo sshd(pam_unix)[31207]: check pass; user unknown

Jun 20 09:20:05 combo sshd(pam_unix)[10035]: check pass; user unknown

Jun 20 09:20:05 combo sshd(pam_unix)[10037]: check pass; user unknown
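Two grep flags worth knowing alongside head are -c (count matching lines) and -i (ignore case). A minimal sketch, using a throwaway sample.log built from two lines like the ones above (the file name is just illustrative):

```shell
# Build a small sample log so the commands below run anywhere
printf 'Jun 18 01:30:59 combo sshd(pam_unix)[31207]: check pass; user unknown\nJun 20 09:20:05 combo sshd(pam_unix)[10035]: CHECK PASS; user unknown\n' > sample.log

# -c prints the number of matching lines instead of the lines themselves
grep -c "check pass" sample.log     # → 1

# -i ignores case, so the CHECK PASS line is counted as well
grep -ci "check pass" sample.log    # → 2
```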

awk: Pattern Scanning and Processing Language

What awk Does

  • Processes and extracts columns of text from structured files (like logs or CSVs).

  • Can perform calculations, formatting, and filtering of data.

  • Useful for reports, summaries, and picking out data fields easily.

Common Use Cases

  • Printing certain columns from a file.

  • Summing or counting values in columns.

  • Filtering rows according to text or number patterns.

Example:

If we want to print the entire log, then:

awk '{ print }' app.log

It will print every line in the app.log file, like the ones given below:

Jun 20 03:40:59 combo ftpd[8829]: connection from 222.33.90.199 () at Mon Jun 20 03:40:59 2005

Jun 20 03:40:59 combo ftpd[8824]: connection from 222.33.90.199 () at Mon Jun 20 03:40:59 2005

Jun 20 03:40:59 combo ftpd[8828]: connection from 222.33.90.199 () at Mon Jun 20 03:40:59 2005

But if we want to print only specific columns, then:

awk '{print $1 $3 $4}' app.log

Jun01:30:59combo

Jun01:30:59combo

Here $ denotes a field: $1 is the month, $3 the time, and $4 the hostname. Without commas the fields are printed with no separator; writing {print $1, $3, $4} would put spaces between them. So awk '{print $1}' app.log prints only the month:

Jun

awk can also filter rows by value. Suppose a file data.txt contains:

101 Vikash Developer 50000

102 Anshuman Engineer 60000

awk '{ if ($4 > 55000) print $4 }' data.txt

The output is 60000, because only the second row's fourth column exceeds 55000.
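The "summing or counting values in columns" use case from above can be sketched the same way. This is a minimal example assuming the two-row data.txt shown here; the END block runs once after the last line has been read.

```shell
# Recreate the sample data.txt from above
printf '101 Vikash Developer 50000\n102 Anshuman Engineer 60000\n' > data.txt

# Sum the 4th column across all rows and print the total at the end
awk '{ total += $4 } END { print total }' data.txt    # → 110000

# Count the rows whose 4th column exceeds 55000
awk '$4 > 55000 { count++ } END { print count }' data.txt    # → 1
```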

sed: Stream Editor

What sed Does

  • Edits a stream of text: can substitute, delete, insert, or change text within a file or stream of text.

  • Performs find-and-replace on large files in a single command.

  • Can edit files in-place or output to a new file/terminal.

Common Use Cases

  • Replacing words or phrases across many lines.

  • Automated editing of configuration files.

  • Removing unwanted lines or numbering.

Example: Suppose file.txt contains the text "hello my name is vikash and friend name is shivam" and we want to replace shivam with vikash. Then:

sed -i 's/shivam/vikash/g' file.txt

The file now reads:

hello my name is vikash and friend name is vikash
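The "removing unwanted lines" use case can be sketched with sed's d (delete) command. The notes.txt file below is just an illustrative example; without -i, sed prints the edited stream and leaves the file untouched.

```shell
# Sample file with a comment line we want to drop
printf '# this is a comment\nhello my name is vikash\n' > notes.txt

# /pattern/d deletes every line matching the pattern
sed '/^#/d' notes.txt    # → hello my name is vikash

# An address works too: 1d deletes the first line by number
sed '1d' notes.txt       # → hello my name is vikash
```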
