Day 1: Stepping into the World of Generative AI

Day 1 – Generative AI & Transformers
Learned what AI really is: not magic, but data-driven generation.
Generative AI = creating something new (not hard-coded). GPT by OpenAI made this mainstream.
Difference between high-level (user view) and low-level (internal workings).
Transformer model (Google’s “Attention is All You Need”): Input → Transformer → Output.
Key concepts:
Tokens vs Sequences
Self-Attention & Multi-Head Attention
Vector Embeddings (turning words into numbers)
Transformer has Training Mode (learns) and Inference Mode (answers).
Special tokens: <bos> (beginning of sequence), <eos> (end of sequence).
Softmax + Temperature → control probability & creativity.
Assignments:
Build a tokenizer
Explain GPT to a child
Explain embeddings to mom
Explain tokenization to a fresher
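For the "build a tokenizer" assignment, here is a minimal word-level sketch. Real GPT tokenizers use subword schemes like BPE, so this only illustrates the core idea: map text to integer IDs and back, with `<bos>`/`<eos>` as special tokens.

```python
class SimpleTokenizer:
    """Toy word-level tokenizer — real models use subword (BPE) tokenizers."""

    def __init__(self, corpus):
        # Build a vocabulary from the training text, reserving special tokens.
        words = sorted(set(corpus.split()))
        self.vocab = {"<bos>": 0, "<eos>": 1, "<unk>": 2}
        for w in words:
            self.vocab[w] = len(self.vocab)
        self.inverse = {i: w for w, i in self.vocab.items()}

    def encode(self, text):
        # Wrap the token IDs with beginning/end-of-sequence markers.
        ids = [self.vocab["<bos>"]]
        ids += [self.vocab.get(w, self.vocab["<unk>"]) for w in text.split()]
        ids.append(self.vocab["<eos>"])
        return ids

    def decode(self, ids):
        # Drop the special tokens and join the words back together.
        return " ".join(self.inverse[i] for i in ids
                        if self.inverse[i] not in ("<bos>", "<eos>"))

tok = SimpleTokenizer("attention is all you need")
print(tok.encode("attention is magic"))        # "magic" maps to <unk>
print(tok.decode(tok.encode("attention is all")))
```

Words not seen during vocabulary building fall back to `<unk>` — one reason production tokenizers split text into subword pieces instead of whole words.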
📝 Diary Style (Casual, Human-Like)
Day 1 of AI Class
Today I started learning about Generative AI. First thing I realized: AI doesn’t actually “think”—it just generates based on patterns from data. Generative AI means it creates something new, unlike hard-coded systems. GPT, made by OpenAI, is the one that started this big wave.
We talked about low-level vs high-level views: one is how things work inside, and the other is what we see outside. Then I learned about Transformers—Google’s big idea from “Attention is All You Need.” The main trick is attention, and especially self-attention (words looking at other words in context). Multi-head attention is like looking at the same sentence from different angles.
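The "words looking at other words" idea can be sketched in a few lines of NumPy. This is a simplified version: a real Transformer first projects the input into learned queries, keys, and values, while here the embeddings are used directly just to show the attention mechanics.

```python
import numpy as np

def self_attention(X):
    # Toy scaled dot-product self-attention: every token attends to every token.
    # (Real Transformers use learned Q/K/V projections; omitted for clarity.)
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                    # how much each word "looks at" the others
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over each row
    return weights @ X                               # context-mixed representations

X = np.random.rand(4, 8)    # 4 tokens, each an 8-dim embedding
out = self_attention(X)
print(out.shape)            # same shape as X, but each row now blends in context
```

Multi-head attention just runs several of these in parallel with different learned projections, so each head can capture a different "angle" on the sentence.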
I also learned about tokens and sequences (breaking text into parts), and vector embeddings (turning words into numbers so machines can work with them). Transformers have two modes: training (learning) and inference (answering). We even saw special tokens like <bos> and <eos>.
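The point of embeddings is that similar words end up near each other in vector space. A tiny sketch with made-up 4-dimensional vectors (real embeddings are learned and have hundreds of dimensions):

```python
import numpy as np

# Hypothetical toy embeddings, invented just to show the idea.
emb = {
    "king":  np.array([0.9, 0.8, 0.1, 0.0]),
    "queen": np.array([0.9, 0.7, 0.2, 0.1]),
    "apple": np.array([0.1, 0.0, 0.9, 0.8]),
}

def cosine(a, b):
    # Cosine similarity: 1.0 means same direction, 0.0 means unrelated.
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine(emb["king"], emb["queen"]))  # high → related meanings
print(cosine(emb["king"], emb["apple"]))  # low → unrelated
```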
Finally, we discussed softmax and temperature, which basically control how predictable or creative the AI's answers are.
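The softmax-plus-temperature idea fits in a few lines. Dividing the raw scores (logits) by the temperature before softmax is how sampling creativity is tuned: low temperature makes the top choice dominate, high temperature spreads probability across more tokens.

```python
import numpy as np

def softmax_with_temperature(logits, temperature=1.0):
    # Lower temperature → sharper distribution (more predictable output);
    # higher temperature → flatter distribution (more varied/creative output).
    scaled = np.array(logits) / temperature
    exp = np.exp(scaled - scaled.max())   # subtract max for numerical stability
    return exp / exp.sum()

logits = [2.0, 1.0, 0.1]   # raw scores for three candidate next tokens
print(softmax_with_temperature(logits, temperature=0.5))  # peaky
print(softmax_with_temperature(logits, temperature=2.0))  # flatter
```

Either way the probabilities sum to 1; temperature only reshapes how they are spread across the candidates.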
My tasks: build a tokenizer, explain GPT to a 5-year-old, explain embeddings to my mom, and explain tokenization to a fresher. It was a lot, but super interesting… #chaicode
Written by Prabhash kumar sah