Log Analyzer and Report Generator
Write a Bash script that automates the process of analyzing log files and generating a daily summary report. The script should perform the following steps:
Input: The script should take the path to the log file as a command-line argument.
Error Count: Analyze the log file and count the number of error messages. An error message can be identified by a specific keyword (e.g., "ERROR" or "FAILED"). Print the total error count.
Critical Events: Search for lines containing the keyword "CRITICAL" and print those lines along with their line numbers.
Top Error Messages: Identify the top 5 most common error messages and display them along with their occurrence count.
Summary Report: Generate a summary report in a separate text file. The report should include:
- Date of analysis
- Log file name
- Total lines processed
- Total error count
- Top 5 error messages with their occurrence count
- List of critical events with line numbers
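For illustration, assume a plain-text log where each entry carries a severity keyword; the excerpt below is hypothetical and only shows the kind of lines the script looks for:

2024-05-01 10:02:11 INFO: Service started
2024-05-01 10:05:43 ERROR: Connection timed out
2024-05-01 10:06:10 FAILED: Backup job did not complete
2024-05-01 10:07:02 CRITICAL: Disk usage above 95%

The script below implements all of the steps above.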
#!/bin/bash
# Check if log file path is provided as an argument
if [[ $# -eq 0 ]]; then
    echo "Usage: $0 path/to/log/file"
    exit 1
fi
# Set log file
log_file="$1"
# Check if the provided log file exists
if [[ ! -f "$log_file" ]]; then
    echo "Error: log file not found: $log_file"
    exit 1
fi
# Variables to generate report
analysis_date=$(date)
total_lines_processed=$(wc -l < "$log_file")
error_count=$(grep -E "ERROR|FAILED" "$log_file" | wc -l)
report_file="log_summary_report_$(date +%Y-%m-%d_%H-%M).txt"
echo "Analyzing log file: $log_file"
echo "Error Count: ${error_count}"
# Use awk to print line number with 'CRITICAL' messages
echo -e "\nCritical Events:"
awk '/CRITICAL/ {print NR, $0}' "$log_file"
# Identify the top 5 most common error messages (the text after the last ': ' is treated as the message)
echo -e "\nTop 5 Error Messages:"
grep -E "ERROR|FAILED" "$log_file" | awk -F': ' '{print $NF}' | sort | uniq -c | sort -nr | head -n 5
echo "Generating report in $report_file"
{
    echo "Date of Analysis: $analysis_date"
    echo "Log File Name: $log_file"
    echo "Total Lines Processed: $total_lines_processed"
    echo "Total Error Count: $error_count"
    echo "Top 5 Error Messages:"
    grep -E "ERROR|FAILED" "$log_file" | awk -F': ' '{print $NF}' | sort | uniq -c | sort -nr | head -n 5
    echo "List of Critical Events with line numbers:"
    awk '/CRITICAL/ {print NR, $0}' "$log_file"
} > "$report_file"
# Archive the report file
archive_dir="/home/ubuntu/log_dir"
mkdir -p "$archive_dir"
# Name the archive after the report so daily archives do not overwrite each other
zip "$archive_dir/${report_file%.txt}.zip" "$report_file"
echo "Report saved in $report_file and archived to $archive_dir."
exit 0
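To try the script, save it (for example as log_analyzer.sh; both the script name and the log path below are just placeholders) and run:

chmod +x log_analyzer.sh
./log_analyzer.sh /var/log/myapp/app.log

The error count, critical events, and top error messages are printed to the terminal, while the full report is written to a log_summary_report_<date>_<time>.txt file in the current directory and then zipped into the archive directory.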
Summary of the Script:
Argument Check:
- The script checks if a log file path is provided as an argument. If not, it displays a usage message and exits.
Log File Validation:
- It verifies that the provided log file exists. If the file is missing, it prints an error message and exits.
Variables:
- Date: Captures the current date and time of the analysis.
- Total Lines Processed: Counts the total number of lines in the log file with wc -l.
- Error Count: Uses grep to count the lines containing "ERROR" or "FAILED" in the log file.
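As a side note, the pipe into wc -l can be avoided: grep's own -c flag counts matching lines directly. This is an equivalent alternative, not what the script above uses:

error_count=$(grep -c -E "ERROR|FAILED" "$log_file")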
Log Analysis:
- Critical Events: Uses awk to print lines containing the word "CRITICAL", along with their line numbers.
- Top 5 Most Common Errors: Uses grep, awk, sort, and uniq to identify and display the top 5 most frequent error messages.
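To see how the frequency pipeline works, here is a small, self-contained walk-through with made-up messages (purely illustrative):

printf '%s\n' "Connection timed out" "Disk full" "Connection timed out" | sort | uniq -c | sort -nr | head -n 5
# Output:
#   2 Connection timed out
#   1 Disk full

sort groups identical messages next to each other, uniq -c prefixes each group with its count, the second sort -nr orders the groups by that count (highest first), and head -n 5 keeps only the top five.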
Report Generation:
- Generates a report summarizing the log analysis (date, log file name, total lines processed, error count, top errors, and critical events).
- The report is saved in a file named log_summary_report_<date>_<time>.txt.
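The report is written with a command group, so the output of all the echo statements and pipelines is redirected to the file in one place. A minimal illustration of the same idiom (the file name is arbitrary):

{
    echo "first line"
    echo "second line"
} > example.txt    # both lines end up in example.txt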
Archiving:
- The report file is zipped and saved in /home/ubuntu/log_dir.
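Note that this step assumes the zip utility is installed. If it is not available, a tar-based alternative (shown here as a sketch, not part of the original script) produces a similar compressed archive:

tar -czf "$archive_dir/${report_file%.txt}.tar.gz" "$report_file"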
Output:
- Displays the location of the saved report and archive.
Key Commands:
- grep: Searches for specific patterns (e.g., "ERROR" or "CRITICAL").
- awk: Extracts and prints specific parts of the log file, such as lines with critical events.
- wc -l: Counts the total lines in the log file.
- sort and uniq: Sort and count unique error messages.
- zip: Archives the report.
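Because the goal is a daily summary, the script pairs naturally with cron. A hypothetical crontab entry (the script path, log path, and time are assumptions) that runs it every day at 23:55 and appends its console output to a log:

55 23 * * * /home/ubuntu/scripts/log_analyzer.sh /var/log/myapp/app.log >> /home/ubuntu/log_dir/analyzer.log 2>&1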