Sadservers Day 2: Sherlock Holmes and the Case of the Noisy IP


After an on-call shift at work, it took some real strength to pick up Day 2 of the Sadservers.com puzzles, but I'm glad I mustered up the courage. Here's how it went:
The Problem Statement
There's a huge web server access log at /home/admin/access.log, with thousands of entries, each line beginning with the IP address of the requester. Somewhere in there, one IP is hammering the server more than any other. My job for the day: become a log detective and catch the “noisiest” IP.
If you've ever managed a public-facing server, this scenario feels all too familiar. Sometimes it's a mischievous bot, sometimes a misbehaving script, and sometimes it's a genuine user who REALLY loves refreshing.
First Steps: Understanding the Evidence
The log file, /home/admin/access.log, captured every HTTP request to the server. Each line began with an IP address, followed by the rest of the request details. Before diving into a solution, I took a moment to peek at a few lines:
192.168.1.34 - - [19/Aug/2025:06:45:21 +0000] "GET /index.html HTTP/1.1" 200 1024
10.0.0.26 - - [19/Aug/2025:06:45:22 +0000] "POST /login HTTP/1.1" 302 512
192.168.1.34 - - [19/Aug/2025:06:45:23 +0000] "GET /dashboard HTTP/1.1" 200 2048
That first column was key.
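Before building anything fancy, it's worth confirming that the IP really is the first whitespace-separated field. Here's a quick sanity check, run against a small hypothetical sample standing in for the real log (the sample.log name and its contents are my own stand-ins):

```shell
# three sample lines standing in for /home/admin/access.log
printf '%s\n' \
  '192.168.1.34 - - [19/Aug/2025:06:45:21 +0000] "GET /index.html HTTP/1.1" 200 1024' \
  '10.0.0.26 - - [19/Aug/2025:06:45:22 +0000] "POST /login HTTP/1.1" 302 512' \
  '192.168.1.34 - - [19/Aug/2025:06:45:23 +0000] "GET /dashboard HTTP/1.1" 200 2048' > sample.log

# keep only the first space-delimited field of each line
cut -d' ' -f1 sample.log
```

Three bare IPs come back, one per request, which is exactly the raw material the counting pipeline needs.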
The Game Plan: Unmask the Busiest Visitor
My goal was straightforward: identify the IP address making the most requests. The UNIX shell offers a perfect kit for this kind of forensic work, particularly with tools like awk, sort, uniq, and a bit of clever piping.
Here’s the plan:
Extract just the IP addresses from the start of every line.
Count how many times each IP appears.
Sort the counts in descending order.
Grab the top offender.
My Solution: Piecing Together the Command
After a few test runs and tweaks, this is the magic one-liner I built:
awk '{print $1}' /home/admin/access.log | sort | uniq -c | sort -nr | head -n 1
Let's unravel what each part does:
awk '{print $1}' /home/admin/access.log pulls out the first field (the IP address, in our case) from every line of the log.
sort groups identical IP addresses onto adjacent lines, which the next step depends on, since uniq only collapses adjacent duplicates.
uniq -c counts how many times each unique IP appears, outputting the count as the first column and the corresponding IP address as the second.
sort -nr re-sorts those lines in reverse numeric order, so the IP with the highest count floats to the top.
head -n 1 displays just that top line: the champ.
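For the curious, the same hunt can be done in a single awk pass that tallies counts in an in-memory array instead of sorting first. This is just a sketch: since the real /home/admin/access.log lives on the Sadservers box, I'm running it here against a tiny hypothetical stand-in file:

```shell
# tiny stand-in for /home/admin/access.log (hypothetical sample data)
cat > access.log <<'EOF'
192.168.1.34 - - [19/Aug/2025:06:45:21 +0000] "GET /index.html HTTP/1.1" 200 1024
10.0.0.26 - - [19/Aug/2025:06:45:22 +0000] "POST /login HTTP/1.1" 302 512
192.168.1.34 - - [19/Aug/2025:06:45:23 +0000] "GET /dashboard HTTP/1.1" 200 2048
EOF

# tally requests per IP in one pass, then pick the largest count
awk '{count[$1]++} END {for (ip in count) print count[ip], ip}' access.log | sort -nr | head -n 1
```

On a huge log this variant only sorts the handful of unique IPs rather than every single line, which can be noticeably faster.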
The output told me exactly who the digital troublemaker was: the offending IP address, with its total request count right next to it.
Why It Works: Lessons from the Trenches
The pipeline above is a template for all kinds of “most frequent” analyses: user agents, URLs, error codes, and more.
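For example, swapping the field number turns the same skeleton into a status-code counter (field 9 in this log layout). Again sketched against a hypothetical sample file rather than the real server log:

```shell
# stand-in sample; on the server this would be /home/admin/access.log
cat > access.log <<'EOF'
192.168.1.34 - - [19/Aug/2025:06:45:21 +0000] "GET /index.html HTTP/1.1" 200 1024
10.0.0.26 - - [19/Aug/2025:06:45:22 +0000] "POST /login HTTP/1.1" 302 512
192.168.1.34 - - [19/Aug/2025:06:45:23 +0000] "GET /dashboard HTTP/1.1" 200 2048
EOF

# field 9 is the HTTP status code; same count-and-sort skeleton as before
awk '{print $9}' access.log | sort | uniq -c | sort -nr
```

A burst of 404s or 500s at the top of that output tells its own story about what's going wrong.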
You don’t always need Python or log viewers; sometimes a few shell commands tell the whole story.
What’s Next?
Sadservers.com keeps giving me fun puzzles, real enough to remind me of past on-call stress (and victories). Today I was the log detective; tomorrow could be a config crime scene or a resource leak.
Stay tuned for Day 3: will it be a networking riddle, a permissions puzzle, or something even stranger?
If you’ve ever found a troublemaker in your logs, share your favorite shell command or war story in the comments!
Written by

Muskan Agrawal
Cloud and DevOps professional with a passion for automation, containers, and cloud-native practices, committed to sharing lessons from the trenches while always seeking new challenges. Combining hands-on expertise with an open mind, I write to demystify the complexities of DevOps and grow alongside the tech community.