Generative AI Explained in College Style: Like You're Pulling an All-Nighter with Maggi & Mental Breakdowns

Arpit Mohankar

"Are we adding value?" — Bro, if it helps you pass viva, then yes.


🎓 Welcome to Your First Lecture on Generative AI

You're in class. It's 9 AM. You're half-asleep. Your laptop has 3% battery, your brain has 0% motivation, and the professor walks in like:

“Today’s topic is Generative AI.”

You? Mentally still in bed.

So let’s break down this fancy tech stuff in the way you actually understand — with memes, Maggi, and mad deadlines.


💥 What is Generative AI?

Imagine a topper in your batch who not only remembers every textbook, lecture note, and Reddit discussion — but can also generate new answers, essays, code, and poems on the spot.

That's Generative AI.
It doesn’t just understand — it creates.

It’s the ChatGPT in your group project that does all the work while you just say, “Bro, you’re a lifesaver.”


📚 GPT: That One Friend Who’s Already Done the Assignment

GPT stands for:

Generative Pre-trained Transformer

Sounds like something from Marvel, but stay with me. Here’s the “hostel logic” breakdown:

  • Generative → It can create content. Essays, jokes, startup ideas, love letters — anything.

  • Pre-trained → Already trained on mountains of data (like that one guy who already solved last year’s paper).

  • Transformer → The neural network architecture under the hood, built around attention. Kinda like your brain on caffeine and panic at 2 AM.


🔤 Tokens & Tokenization: Maggi Analogy 🍜

“Bhaiya, ek Maggi tod ke do na...” (“Bro, break the Maggi into pieces for me, please...”)
That’s tokenization in real life.

Just like you break noodles into small bits before cooking, tokenization breaks sentences down into smaller pieces called tokens. It helps the AI process language like:

“Bro, I need your help.” → [“Bro”, “,”, “I”, “need”, “your”, “help”, “.”]

AI doesn’t “read” like us — it chews on tokens and tries to make sense, just like we chew through last-minute assignments.
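
Want to see the noodle-breaking in code? Here's a toy sketch in Python. Real GPT tokenizers use byte-pair encoding and split text into subword pieces, not whole words, so treat this as the vibe, not the spec:

```python
import re

def tokenize(sentence: str) -> list[str]:
    # Toy tokenizer: split into words and punctuation marks.
    # Real GPT tokenizers use byte-pair encoding (subword chunks),
    # but the "break it into pieces" idea is the same.
    return re.findall(r"\w+|[^\w\s]", sentence)

print(tokenize("Bro, I need your help."))
# ['Bro', ',', 'I', 'need', 'your', 'help', '.']
```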


🧠 Vector Embeddings: Your Crush as a Coordinate

Every word gets turned into a list of numbers called a vector. It’s like turning emotions into data.

Example:

"Love" → [0.13, 0.9, -0.3, 0.01, …]
"Hate" → [0.12, 0.88, -0.29, 0.00, …]

So the AI understands the vibe — not just the word. It’s like when you say "I'm fine" and your best friend knows you're not.
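
Curious how the AI measures "vibe"? Here's a minimal Python sketch of cosine similarity, the standard way to compare two embedding vectors. The numbers below are made up for illustration; real embeddings have hundreds or thousands of dimensions:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # How aligned two vectors are: close to 1 = same vibe,
    # near 0 = unrelated, negative = opposite direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Made-up 4-dimensional embeddings, purely for illustration
love = [0.13, 0.90, -0.30, 0.01]
hate = [0.12, 0.88, -0.29, 0.00]
exam = [-0.70, 0.10, 0.80, 0.50]

print(cosine_similarity(love, hate))  # ~1.0: both intense emotions, similar vibe
print(cosine_similarity(love, exam))  # negative: completely different vibe
```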


📍 Positional Encoding: Word Order Matters!

Without Positional Encoding, GPT would read:

“The dog chased the cat”
and
“The cat chased the dog”

as the same thing. But in real life, that’s like swapping questions 1 and 10 in an exam. Total chaos.

So it adds position info to tokens, like roll numbers on exam sheets, so the meaning stays intact.
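
For the curious, here's the classic sinusoidal recipe from the original Transformer paper ("Attention Is All You Need"), sketched in Python. Each position gets a unique pattern of sine and cosine values, which the model adds to that token's embedding:

```python
import math

def positional_encoding(position: int, d_model: int) -> list[float]:
    # Sinusoidal positional encoding: even dimensions get sin,
    # odd dimensions get cos, each at a different frequency,
    # so every position ends up with a unique "roll number".
    pe = []
    for i in range(d_model):
        angle = position / (10000 ** (2 * (i // 2) / d_model))
        pe.append(math.sin(angle) if i % 2 == 0 else math.cos(angle))
    return pe

print([round(x, 2) for x in positional_encoding(0, 4)])  # [0.0, 1.0, 0.0, 1.0]
print([round(x, 2) for x in positional_encoding(5, 4)])  # a totally different pattern
```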


🔁 Self-Attention: The Gossip King👑

Self-Attention is like that one friend who remembers everything — who said what, where, and when in the group chat.

While processing a sentence, GPT looks at every word and decides how much it matters to every other word. Just like how in:

“I never said she stole my notes.”

Depending on which word you emphasize, the meaning changes — and GPT catches that. Drama unlocked.
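
For the front-benchers, here's the actual formula as a small numpy sketch: scaled dot-product attention, softmax(QK^T / sqrt(d)) V. The Q, K, V matrices below are random stand-ins; in a real model they come from learned projections of the token embeddings:

```python
import numpy as np

def self_attention(Q, K, V):
    # Scaled dot-product attention: softmax(QK^T / sqrt(d)) @ V
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # how relevant each word is to each other word
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: rows sum to 1
    return weights @ V  # each token becomes a weighted mix of all the values

# 3 tokens with 4-dimensional embeddings, random stand-ins for the demo
rng = np.random.default_rng(42)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
print(self_attention(Q, K, V).shape)  # (3, 4): one context-aware vector per token
```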


🧪 Training: How GPT Went from First Bench to Genius

Training a model = feeding it billions of text examples until it starts making sense. Like:

  • Input: “Roses are red…”

  • AI: “Violets are blue…”

At first, it'll say stuff like:

“Roses are red... potatoes are crunchy”

But over time (and many GPUs burned alive), it learns to predict better — just like you studying one day before the exam and somehow surviving.
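
If you want to feel the "predict the next word" game yourself, here's a toy version in Python. It just counts which word follows which in some training text (a bigram model, GPT's great-great-grandparent). GPT swaps the counting for billions of learned parameters and gradient descent, but the objective, guessing the next token, is the same:

```python
from collections import Counter, defaultdict

# Toy "training data": the real thing is a huge chunk of the internet
corpus = "roses are red violets are blue maggi is life sleep is rare".split()

# Count which word follows which: a bigram model, GPT's ancient ancestor
next_word_counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    next_word_counts[current][nxt] += 1

def predict_next(word: str) -> str:
    # Greedy prediction: the most frequent follower seen in training
    return next_word_counts[word].most_common(1)[0][0]

print(predict_next("roses"))  # 'are'
print(predict_next("are"))    # 'red' (ties go to the follower seen first)
```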


🔍 Inferencing: How It Answers Your Questions

Once trained, GPT is ready for inference — aka when you ask it something and it answers like a pro.

You type:

“Write a breakup message in Shashi Tharoor English.”

GPT:

“Our cognitive wavelengths no longer align within the same ontological bandwidth, hence, let’s disengage emotionally.”

Legend. 🙇‍♂️
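
Under the hood, inference is just a loop: predict the next token, append it, predict again. Here's that loop running on the toy bigram model from the training section (rebuilt here so the snippet stands alone):

```python
from collections import Counter, defaultdict

# Same toy bigram "model" as in the training section
corpus = "roses are red violets are blue maggi is life sleep is rare".split()
next_word_counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    next_word_counts[current][nxt] += 1

def generate(start: str, length: int = 6) -> str:
    # Autoregressive generation: the same predict-append-repeat loop
    # GPT runs, minus a few billion parameters and sampling tricks.
    words = [start]
    for _ in range(length):
        followers = next_word_counts.get(words[-1])
        if not followers:
            break  # never saw this word during "training"
        words.append(followers.most_common(1)[0][0])
    return " ".join(words)

print(generate("roses"))
# 'roses are red violets are red violets': greedy decoding gets stuck in
# loops, which is exactly why real models sample with temperature instead
```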


🤯 Real Talk: Why Should You Care?

Because bro — this is the future. From automating assignments to building actual startups, Generative AI is your tech genie.

Use it to:

  • Write better resumes

  • Build smarter projects

  • Generate ideas

  • Roast your friends in 10 different programming languages


🎤 TL;DR for the Backbenchers

  • Generative AI = Smart AF bot that can create stuff

  • GPT = Hermione with a CPU and no sleep

  • Tokens = Words broken into bite-sized Maggi pieces

  • Vector Embedding = Feeling-to-number conversion

  • Self-Attention = Gossiping algorithm

  • Training = Internet + GPUs = Smart model

  • Inferencing = Now you talk to it like your lab partner


💬 So… Are We Adding Value?

If you laughed, learned something, or found your next AI project idea, then 100% yes.
Now go flex this AI gyaan in your next group project like a final-year king/queen 👑.


Made with ☕, Maggi, and minimal sleep by a fellow over-caffeinated student.
