How was code invented?

Table of contents
- Evolution of Computers
- The Era of Early Electronic Computers (1940s)
- ENIAC (1945): The First General-Purpose Computer
- The Birth of Assembly Language (1950s)
- The Birth of High-Level Programming Languages (1950s–1960s)
- The Rise of Personal Computing & Modern Languages (1970s–1980s)
- The Internet Revolution & Open Source Movement (1990s)
- Mobile Computing, AI, & Cloud Computing (2000s–2010s)
- The AI & Quantum Computing Era (2020s)
- So, How Did We “Discover” Code?
- Final Thoughts

A friend of mine once asked me out of curiosity:
“Jadesola, how did people discover code? Who invented it? And how did they… code the code?”
A bit of a spiral question, right? At the time, I gave an answer that, in hindsight, didn’t really answer anything. Mostly because I didn’t know, and admitting that didn’t feel like an option—lol.
But recently, I found myself thinking about it again. Who looked at a bunch of circuits and thought, “What if I could make this thing follow instructions?” How did they figure out syntax, semantics, and all those things we now take for granted? Who wrote the first program, and what was going through their mind at that moment? So. Many. Questions.
This article is about the evolution of computers and programming—because you really can’t talk about coding without talking about computers. So, if you’ve ever wondered about the origins of programming, grab some popcorn, get comfortable, and let’s take a trip through time.
Evolution of Computers
When Computers Weren’t Really "Computers"
Before we talk about programming, let’s rewind to a time when the word computer didn’t mean a sleek laptop or a powerful server. Before the early 1800s, calculations were done using simple yet effective tools. The abacus, often considered the earliest known calculating device, was widely used alongside other manual aids like Napier's Bones and slide rules. Later, mathematicians such as Blaise Pascal and Gottfried Wilhelm Leibniz built early mechanical calculators that could perform basic arithmetic, laying the groundwork for more advanced computing machines in the centuries to come.
The Analytical Engine: The First Computer That Never Was
In the 1830s, British mathematician Charles Babbage came up with a radical idea: “What if we built a machine that could calculate automatically?” He designed the Analytical Engine, a mechanical machine with gears and levers that could store numbers, process data, and even make decisions based on conditions—essentially, a mechanical computer.
Unfortunately, due to funding issues (and the fact that 19th-century technology just wasn’t ready for it), the Analytical Engine was never fully built. But the idea planted the seed for modern computing.
The First Programmer: Ada Lovelace (1843)
Babbage was the hardware guy, but Ada Lovelace saw the bigger picture. She was a brilliant mathematician who realized that the Analytical Engine wasn’t just a calculator—it could follow instructions, making it the first programmable machine.
She wrote what is now considered the first-ever computer program, an algorithm to compute Bernoulli numbers. Lovelace even foresaw that computers could one day create music and art—a vision that took more than a century to come true.
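Out of curiosity, here’s what computing Bernoulli numbers can look like today. This sketch uses the Akiyama–Tanigawa algorithm—a modern method, not Lovelace’s original, which was written as a sequence of operations for the Analytical Engine’s mill and store:

```python
from fractions import Fraction

def bernoulli_numbers(n):
    """Return B_0 .. B_n via the Akiyama-Tanigawa algorithm (B_1 = +1/2 convention)."""
    row = []
    result = []
    for m in range(n + 1):
        row.append(Fraction(1, m + 1))
        # Transform the row in place; after this loop, row[0] equals B_m.
        for j in range(m, 0, -1):
            row[j - 1] = j * (row[j - 1] - row[j])
        result.append(row[0])
    return result

print([str(b) for b in bernoulli_numbers(4)])  # ['1', '1/2', '1/6', '0', '-1/30']
```

A handful of lines in Python; Lovelace had to express the same idea as a table of punched-card operations for a machine that was never even built.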
And you know what? When I first learned that the world’s first programmer was a woman, I felt a ridiculous amount of pride—like, “Yes, Ada, we started this thing!” If only she could see me debugging JavaScript at 2 AM. 😭
Colossus (1943-1944)
During World War II, British engineers developed Colossus, the world’s first programmable electronic digital computer, to break German codes. Instead of gears and levers, it used vacuum tubes (early electronic switches) to process data at incredible speeds.
The Era of Early Electronic Computers (1940s)
ENIAC (1945): The First General-Purpose Computer
While Colossus was designed specifically for codebreaking, the ENIAC (Electronic Numerical Integrator and Computer) was the first general-purpose programmable electronic computer.
It weighed 30 tons, filled an entire room, and had nearly 18,000 vacuum tubes that needed constant maintenance. Programming it wasn’t like today—you had to literally rewire the machine to change its instructions. Imagine having to physically rewire your phone every time you wanted to open a different app. That was early computing!
After ENIAC, computers became more reliable and programmable.
EDVAC (1949) – The Electronic Discrete Variable Automatic Computer introduced the stored-program concept: instructions were kept in memory rather than being manually rewired. This was a game-changer.
UNIVAC I (1951) – The Universal Automatic Computer I, the first commercially produced computer in the United States, marked the start of business computing.
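The stored-program idea is easy to demonstrate in miniature. Here’s a toy machine in Python—the instruction set (LOAD/ADD/HALT) is invented purely for illustration—where the program is just data sitting in memory, so “reprogramming” means loading different values, not rewiring hardware:

```python
# A toy stored-program machine: the program is ordinary data held in
# memory, which is exactly what EDVAC's design made possible.

def run(memory):
    acc = 0                      # a single accumulator register
    pc = 0                       # program counter: which cell to execute next
    while True:
        op, arg = memory[pc]     # fetch the instruction stored at address pc
        pc += 1
        if op == "LOAD":         # put a value in the accumulator
            acc = arg
        elif op == "ADD":        # add a value to the accumulator
            acc += arg
        elif op == "HALT":       # stop and hand back the result
            return acc

program = [("LOAD", 2), ("ADD", 3), ("HALT", None)]
print(run(program))  # 5
```

Swap in a different list of tuples and the same machine does a different job—that flexibility is what made stored programs revolutionary.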
The Birth of Assembly Language (1950s)
Early computers were programmed in machine language—raw numeric codes, ultimately just 1s and 0s. But writing programs in binary was painfully difficult, like trying to write a novel in Morse code.
Then came assembly language in the 1950s, a human-readable way to write instructions using short mnemonics instead of raw numbers. Instead of typing 101010101, you could write commands like ADD A, B.
The Birth of High-Level Programming Languages (1950s–1960s)
Once people realized that coding could be simplified, a wave of programming languages started emerging—beginning in this era and continuing for decades:
FORTRAN (1957): The first widely used high-level language, mostly for scientific calculations.
COBOL (1959): Designed for business applications, ensuring that companies didn’t have to rely on machine-specific code.
C (1972): The language that laid the foundation for many modern languages, influencing Python, Java, and JavaScript.
Python (1991): Created to be easy to read and write, helping developers focus on logic rather than syntax.
JavaScript (1995): The language that made websites interactive, paving the way for modern web applications.
Each new language made programming more accessible, moving us from machine code to human-friendly syntax. Today, you can even use no-code tools to build applications without writing a single line of code—something that would have seemed like science fiction 50 years ago.
This era saw computing shift from research labs to businesses and government agencies.
The Rise of Personal Computing & Modern Languages (1970s–1980s)
Computers became more accessible, and programming languages improved.
Microprocessors (1971) – Intel’s 4004 chip revolutionized computing by enabling smaller, more affordable computers.
Personal Computers Begin
Altair 8800 (1975) – Often considered the first personal computer; it was programmed with front-panel switches, with blinking LEDs as its output.
Apple I (1976) & Apple II (1977) – Made computing more user-friendly and accessible to individuals.
MS-DOS (1981) & Windows (1985) – Microsoft’s software helped standardize personal computing.
Object-Oriented Programming (OOP) Emerges
C++ (1985) – Brought objects and classes into mainstream use, making large programs more manageable.
Macintosh (1984) – Apple introduced the first widely successful computer with a graphical user interface (GUI), making computers more user-friendly.
This period marked the transition from mainframe dominance to personal computing.
The Internet Revolution & Open Source Movement (1990s)
The world became connected, and programming became more flexible.
World Wide Web (1991) – Tim Berners-Lee invented the Web and wrote the first web browser, bringing the internet to the masses.
Python (1991) – A high-level, easy-to-read programming language that grew in popularity over time.
Java (1995) – “Write once, run anywhere” made Java perfect for cross-platform applications.
JavaScript (1995) – The foundation of modern web development, enabling interactivity on websites.
Linux & Open Source (1991–2000s) – Open-source software became a movement, leading to collaborative innovation.
This period set the stage for modern web development and online services.
Mobile Computing, AI, & Cloud Computing (2000s–2010s)
Computers got smaller, smarter, and more connected.
Smartphones & Mobile Apps (2007–Present) – iPhone & Android changed how we interact with computers.
Cloud Computing (2010s) – Services like AWS and Google Cloud made computing more scalable and accessible.
Big Data & AI Boom
Machine Learning & Deep Learning (2010s) – Programming evolved to include AI-powered applications.
Python & R became key languages for data science and artificial intelligence.
JavaScript Frameworks (2010s) – React, Angular, and Vue.js made web development more dynamic.
The shift from desktop-first to mobile-first computing happened in this era.
The AI & Quantum Computing Era (2020s)
Computing is now more powerful, automated, and decentralized.
AI-Powered Coding – Tools like GitHub Copilot and ChatGPT assist developers.
Blockchain & Web3 – Decentralized applications (dApps) are changing how we think about the internet.
Quantum Computing – Companies like Google and IBM are exploring quantum computing for solving complex problems.
Low-Code/No-Code Movement – Platforms allow non-programmers to build applications visually.
So, How Did We “Discover” Code?
It wasn’t discovered like gravity or electricity. Instead, it evolved—step by step—as people figured out better ways to communicate with machines. From Ada Lovelace's algorithms to modern AI systems, programming has grown into one of the most powerful tools in human history.
Today, programming is everywhere—from the software in your phone to self-driving cars and AI chatbots. And the best part? The evolution is still ongoing. Who knows? Maybe in the future, writing code will be as simple as speaking to a computer in plain English.
But no matter how advanced things get, it all started with a simple idea: What if a machine could follow instructions?
Final Thoughts
So, if you’ve ever wondered who invented code and how they coded the first code, the answer is that it was a long, fascinating journey. Computers and programming didn’t appear overnight; they evolved through decades of innovation and problem-solving.
And just like that, I finally have an answer for my friend. Next time they ask, I won’t have to make something up—I'll just send them this article. 😄
Thank you for reading! If you found this helpful, give it a like… or ten, and don’t forget to follow for more content and share.
Written by

Jadesola Adeagbo
Hi🙋🏽♀️, I'm Jadesola, a software developer based in Nigeria 🛠️. Driven by a passion for solving problems with code, I'm currently refining my skills as a front-end developer while delving into the world of back-end development. I am dedicated to sharing my knowledge and experience as I grow in the tech world. Join me on my journey and let's grow together!