Challenge Title: Log Analyzer and Report Generator
Scenario
As a system administrator, managing a network of servers is a critical part of your job. Every day, each server generates a log file containing essential system events and error messages. Analyzing these logs is vital for maintaining the health and security of the servers. To streamline this process, we'll create a Bash script that automates log analysis and report generation.
Task
Your task is to write a Bash script that automates the process of analyzing log files and generating a daily summary report. The script should perform the following steps (a short command sketch illustrating several of them appears after the list):
Input: The script should take the path to the log file as a command-line argument.
Error Count: Analyze the log file and count the number of error messages. An error message can be identified by a specific keyword (e.g., "ERROR" or "Failed"). Print the total error count.
Critical Events: Search for lines containing the keyword "CRITICAL" and print those lines along with the line number.
Top Error Messages: Identify the top 5 most common error messages and display them along with their occurrence count.
Summary Report: Generate a summary report in a separate text file. The report should include:
Date of analysis
Log file name
Total lines processed
Total error count
Top 5 error messages with their occurrence count
List of critical events with line numbers
Optional Enhancement: Add a feature to automatically archive or move processed log files to a designated directory after analysis.
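Before writing the full script, it can help to prototype the individual steps as one-off commands. The sketch below is only illustrative: app.log is a placeholder file name, and splitting on the last ']' assumes the message text sits at the end of each bracketed log line.
# Count lines containing ERROR or Failed
grep -cE "ERROR|Failed" app.log

# Show CRITICAL lines together with their line numbers
grep -n "CRITICAL" app.log

# Top 5 most common error messages (message text assumed to follow the last ']')
grep -E "ERROR|Failed" app.log | awk -F']' '{print $NF}' | sort | uniq -c | sort -rn | head -n 5

# Archive the processed log file into a designated directory
mkdir -p log_archive && mv app.log log_archive/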
Tips
Use grep, awk, and other command-line tools to process the log file.
Utilize arrays or associative arrays to keep track of error messages and their counts.
Use appropriate error handling to manage cases where the log file doesn't exist or other issues arise.
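The associative-array tip is the approach the full script below takes. As a minimal sketch (Bash 4+ is assumed, and the log path is assumed to arrive as the script's first argument):
declare -A counts
while IFS= read -r line; do
    # For brevity the whole matching line is used as the key;
    # the full script below extracts just the message text instead.
    [[ "$line" == *ERROR* ]] && ((counts["$line"]++))
done < "$1"

# Print the five most frequent entries
for key in "${!counts[@]}"; do
    echo "${counts[$key]} $key"
done | sort -rn | head -n 5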
Sample Log File
A sample log file named sample_log.log has been provided in the same directory as this challenge file. You can use this file to test your script or use your own log files.
Real-Life Example
Imagine you are working for a company that runs a web application across multiple servers. Every day, you need to ensure that the application is running smoothly by checking the server logs for any errors or critical issues. Manually inspecting these logs would be time-consuming and error-prone. Automating this process with a script saves you time and ensures that no important events are missed.
Script: log_analyzer.sh
#!/bin/bash

# Require the path to the log file as the first argument.
if [ -z "$1" ]; then
    echo "Usage: $0 <path_to_log_file>"
    exit 1
fi

log_file=$1
report_file="summary_report_$(date +%Y%m%d).txt"

# Abort if the log file does not exist.
if [ ! -f "$log_file" ]; then
    echo "Log file not found!"
    exit 1
fi

# Basic statistics: total lines, error count, and critical events with line numbers.
total_lines=$(wc -l < "$log_file")
error_count=$(grep -cE "ERROR|Failed" "$log_file")
critical_events=$(grep -n "CRITICAL" "$log_file")

# Print the error count and critical events to the terminal, as the task requires.
echo "Total error count: $error_count"
echo "Critical events (line number:line):"
echo "$critical_events"

# Count occurrences of each distinct error message in an associative array.
declare -A error_messages
while IFS= read -r line; do
    if [[ "$line" =~ ERROR|Failed ]]; then
        # Treat the text after the last ']' as the error message.
        error_message=$(echo "$line" | awk -F']' '{print $NF}')
        ((error_messages["$error_message"]++))
    fi
done < "$log_file"

# Sort the per-message counts and keep the five most frequent.
top_errors=$(for message in "${!error_messages[@]}"; do
    echo "${error_messages[$message]} $message"
done | sort -rn | head -n 5)

# Write the summary report.
echo "Date of analysis: $(date)" > "$report_file"
echo "Log file name: $log_file" >> "$report_file"
echo "Total lines processed: $total_lines" >> "$report_file"
echo "Total error count: $error_count" >> "$report_file"
echo "Top 5 error messages with their occurrence count:" >> "$report_file"
echo "$top_errors" >> "$report_file"
echo "List of critical events with line numbers:" >> "$report_file"
echo "$critical_events" >> "$report_file"

# Optional: move the processed log file to an archive directory.
archive_dir="log_archive"
mkdir -p "$archive_dir"
mv "$log_file" "$archive_dir/"

echo "Analysis complete. Report generated: $report_file"
Conclusion
Automating log analysis with a Bash script not only saves time but also ensures consistency and accuracy in identifying and reporting critical events and errors. This script serves as a practical tool for system administrators to maintain server health and quickly respond to issues. By incorporating this script into your daily routine, you can focus more on proactive measures and less on manual log inspection.
Learn More
Check out this YouTube tutorial for a deeper dive into shell scripting:
Shell Scripting Tutorial for Beginner