The Cyber Arsenal: From Sabotage to Surveillance


When the world talks about war, it conjures images of soldiers on battlefields, missiles, and armies. But over the last two decades a quieter, more pervasive form of conflict has emerged. It is fought not with guns or bombs but with code, and its impact can be invisible, insidious, and utterly transformative.

This narrative follows two of the most consequential cyber weapons in modern history. One attacked physical infrastructure. The other turned everyday phones into spying devices. Both are real. Both changed the game. What they reveal is something unsettling: in the digital age, billions of people may already be on the front lines.

There is a Stuxnet documentary called Zero Days, which I’d recommend watching if you’re interested.


Part One: Stuxnet, The Worm That Killed Centrifuges

In 2010, cybersecurity researchers began noticing something strange.

Industrial control systems that managed nuclear plants and factory lines were behaving oddly. A clue came from a small antivirus lab in Belarus, where a customer from Iran was reporting recurring system crashes. This was no ordinary malware outbreak. Experts at Symantec and Kaspersky identified an exceptionally sophisticated piece of code that had been infecting Windows systems for months. It exploited multiple zero-day vulnerabilities and hid itself behind rootkits.

What made Stuxnet unique was that it targeted not people but machines. Specifically, the programmable logic controllers made by Siemens that ran centrifuges at Natanz, Iran’s nuclear enrichment site. It searched for these devices, and if found, quietly changed their code to run centrifuges at irregular speeds until they broke. Meanwhile, it faked normal readings so operators wouldn’t notice.

In effect, software became a weapon in the most literal sense. Machines were physically destroyed before anyone could identify the cause.
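To make that mechanism concrete, here is a deliberately simplified simulation of the idea: the controller quietly commands damaging speeds while the operator’s display replays previously recorded normal readings. This is an illustrative sketch only; the names and values are invented and bear no relation to Stuxnet’s actual PLC code.

```python
# Illustrative simulation of "sabotage plus sensor replay".
# Nothing here resembles real PLC code; names and numbers are invented.
import random

NORMAL_RPM = 63_000  # hypothetical safe operating speed
# Readings captured while the system ran normally, to be replayed later
recorded_normal = [NORMAL_RPM + random.randint(-50, 50) for _ in range(60)]

def commanded_speed(step: int) -> int:
    """What the sabotaged controller actually commands: periodic damaging bursts."""
    if step % 20 < 3:
        return 84_000   # brief over-speed burst
    if step % 20 < 6:
        return 2_000    # sudden slowdown that stresses the rotor
    return NORMAL_RPM   # otherwise behave normally to avoid suspicion

def displayed_speed(step: int) -> int:
    """What the operator's screen shows: replayed 'normal' readings."""
    return recorded_normal[step % len(recorded_normal)]

for step in range(12):
    print(f"t={step:02d}  commanded={commanded_speed(step):6,} rpm  "
          f"operator sees={displayed_speed(step):6,} rpm")
```

The point of the sketch is the gap between what the equipment actually does and what the people monitoring it are shown. That gap is why the damage could accumulate for so long without anyone suspecting the software.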

Stuxnet did not remain trapped in its intended environment. It spread far beyond its original target, infecting an estimated 200,000 machines worldwide. Security investigators likened this runaway code to a precision-guided missile that kept firing even after the mission had ended.

The question became pivotal. How do you control a weapon that is made of software and designed to spread invisibly?

The worm had likely been in development since 2005, part of a program code-named Olympic Games. It was a covert effort by the US and Israel to slow Iran’s nuclear program without using military force. It used no fewer than four zero-day exploits, signed drivers with stolen certificates, and included sabotage routines that triggered only when exact conditions were met.
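That last detail, sabotage routines that triggered only when exact conditions were met, is worth dwelling on. Conceptually, the payload fingerprints its environment and stays dormant everywhere except on the precise configuration it was built to attack. Here is a toy sketch of that idea, with every identifier and value invented for illustration:

```python
# Toy "fingerprint before firing" check: the payload stays dormant unless
# the environment matches a very specific target profile.
# Every identifier and value below is invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class Environment:
    plc_vendor: str
    controller_model: str
    attached_devices: int

# The one configuration the payload was built for
TARGET_PROFILE = Environment("ExampleVendor", "ExampleModel-300", 164)

def should_activate(env: Environment) -> bool:
    """Arm the payload only on an exact match; stay silent everywhere else."""
    return env == TARGET_PROFILE

print(should_activate(Environment("ExampleVendor", "ExampleModel-300", 164)))  # True
print(should_activate(Environment("ExampleVendor", "ExampleModel-400", 12)))   # False
```

That selectivity is what makes such code both precise and hard to detect: on the vast majority of infected machines it simply does nothing, which is also how it can spread so widely before anyone notices.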

It was, effectively, the world’s first cyber weapon to cause real world physical damage.


Part Two: Pegasus, The Spy in Your Pocket

In 2016, Pegasus, built by the NSO Group, an Israeli cyber intelligence company, first came to public attention.

It was not designed to destroy machines but to infiltrate the most personal object we own: the smartphone. Once installed, Pegasus granted full access to the phone’s cameras, microphones, messages, call logs, photos, contacts, and location. It even sidestepped end-to-end encryption by reading messages directly on the device, leaving secure apps like WhatsApp and Signal exposed.

More disturbing was how Pegasus could be deployed. It did not require users to click on anything. This was the era of the zero-click exploit: an invisible message could trigger the infection, and the user would never know.

Investigations by groups like Citizen Lab and Amnesty International uncovered widespread misuse. In India, journalists, opposition politicians, and activists were targeted. In Mexico, it was used against anti-corruption campaigners and even the families of murdered students.

Victims described a feeling of constant exposure, like their lives were broadcast live to unseen watchers. Phones were no longer personal devices. They were portable surveillance tools.

Pegasus was sold to governments under the premise of fighting crime and terrorism. But the controls were weak, and the accountability nearly non-existent. Investigations revealed Pegasus was used far beyond its stated purpose. Democracies and authoritarian regimes alike employed it to watch their own people.

This tool was not the product of a rogue hacker. It was a polished commercial product, sold under license. Yet it enabled abuses that raised profound questions about privacy, power, and the future of digital freedom.


Part Three: From Sabotage to Surveillance

Stuxnet and Pegasus are not isolated incidents.

They represent a shift in how power is exercised through technology. One destroyed physical infrastructure. The other invaded personal space. Both required no physical presence. Both caused profound harm. And both were built by or for states.

Cyber weapons like these live in a grey area. They are powerful, effective, and hard to trace. They can cross borders silently. They can be reused or repurposed. And they often spread beyond their original targets.

When agencies build cyber weapons, they often fail to control them. Stuxnet spread globally. Pegasus was leaked, resold, and repurposed. This is the cyber weapon gap: a lack of accountability, a lack of regulation, and a growing potential for chaos.

Stuxnet was government-built and deployed. Pegasus was government-purchased and deployed. In both cases, the outcome was determined behind closed doors. Ordinary people, journalists, engineers, and activists paid the price.

This highlights a core issue of modern cybersecurity. The real threat may not be foreign hackers or criminal gangs. It may be the unchecked use of state sponsored tools designed to watch or destroy.


Part Four: Lessons for the Digital Age

Architecture Matters

Stuxnet proved that industrial systems can be hacked even when isolated. All it takes is one infected USB stick. Pegasus showed that no mobile device is safe, even when encrypted. Security architecture must evolve. Air gaps and firewalls are not enough. Trust is not a default. It must be continuously verified.
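One way to read “trust must be continuously verified” in practice: a controller should refuse any command that cannot prove its authenticity, rather than trusting everything that arrives from an engineering workstation or the local network. The sketch below uses Python’s standard-library HMAC as a simplified stand-in for the certificate-based signing a real industrial deployment would need; the key handling and command format are invented for the example.

```python
# Minimal "verify before trusting" sketch: execute a command only if its
# authentication tag checks out. Real systems would use certificate-based
# signing and proper key management; this only illustrates the principle.
import hashlib
import hmac

SHARED_KEY = b"key-provisioned-out-of-band"  # hypothetical pre-shared key

def sign(command: bytes) -> bytes:
    return hmac.new(SHARED_KEY, command, hashlib.sha256).digest()

def execute_if_trusted(command: bytes, tag: bytes) -> None:
    # Constant-time comparison avoids leaking information via timing
    if not hmac.compare_digest(sign(command), tag):
        print(f"REJECTED unauthenticated command: {command!r}")
        return
    print(f"Executing verified command: {command!r}")

legit = b"set_speed 63000"
execute_if_trusted(legit, sign(legit))              # accepted
execute_if_trusted(b"set_speed 84000", b"forged")   # rejected
```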

Regulation Is Urgent

Cyber weapons have outpaced international law. There are no Geneva Conventions for code. Export controls are weak. Transparency is rare. Regulation must catch up before the damage is irreversible.

Privacy Equals Security

Pegasus turned phones into tracking devices. That is not just a privacy violation. It is a security risk. For activists and journalists, it can be a matter of life and death. Protecting privacy is not about hiding. It is about ensuring that people can live and work without fear.

We Are All Targets

The era of cyber warfare is not coming. It is here. And it does not distinguish between combatants and civilians. If your infrastructure connects to the internet, you are a potential target. If your phone can be compromised, you are vulnerable. Cybersecurity is no longer a niche concern. It is a fundamental part of modern life.


The stories of Stuxnet and Pegasus are not history. They are warning signs. They show how invisible tools can create very visible harm. They reveal how power can be exercised quietly, efficiently, and devastatingly.

But they also offer an opportunity. An opportunity to push for transparency, accountability, and resilience. An opportunity to question who builds the tools, who uses them, and who they are used against.

In the end, the future of cybersecurity is not just technical. It is political, ethical, and deeply human. Because in a world where code can change everything, the most important thing may be who writes it, and why.

 —  Hmad


Written by Hmad

I'm a cybersecurity enthusiast with a growing focus on offensive security. Currently studying for the eJPT & ICCA, building hands-on projects like Infiltr8, and sharing everything I learn through blog posts and labs.