Teaching Vector Embeddings to My Mom

I once asked my mom, “How do you cook so perfectly without measuring anything? You always know exactly when and how much salt to add to food, sugar to chai, or hing to soup — and it always turns out amazing!”
For a moment, I thought I heard her say “vector embeddings” — but of course, she didn’t. She simply replied, “It’s because of my experience.”
That’s when it hit me — this is exactly how vector embeddings work!
In machine learning, vector embeddings are like a map where every word is represented as a point in space. Picture it in 3D, though real embeddings often have hundreds of dimensions. The direction and distance between points capture how related the words are: words that sit close together on this map are similar in meaning, while words far apart have little in common.
Just like my mom doesn’t need a recipe because her brain has built a “flavor map” from years of cooking, vector embeddings build their own “meaning map” from huge amounts of data. Experience teaches her where each ingredient fits best, just as data teaches an embedding model where each word belongs on the map.
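If you’d like to see the idea in code, here’s a tiny sketch in Python. The numbers below are made up purely for illustration (real embeddings come from trained models like word2vec or a transformer); the point is how we measure “closeness” on the map, usually with cosine similarity:

```python
import numpy as np

# Toy 3-dimensional "embeddings" -- made-up numbers for illustration only.
# Real embeddings have hundreds of dimensions and are learned from data.
embeddings = {
    "salt":  np.array([0.9, 0.1, 0.0]),
    "sugar": np.array([0.8, 0.2, 0.1]),
    "chai":  np.array([0.1, 0.9, 0.3]),
}

def cosine_similarity(a, b):
    """How aligned two vectors are: close to 1.0 means very related."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Ingredients that play similar roles sit close together on the map...
print(cosine_similarity(embeddings["salt"], embeddings["sugar"]))  # ~0.98, very related
# ...while less related words sit farther apart.
print(cosine_similarity(embeddings["salt"], embeddings["chai"]))   # ~0.21, less related
```

That single similarity score is the math behind the “closer means more related” intuition above.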
When I wanted to show my mom where vector embeddings are used in GPT, I shared this link with her: GPT for a 5-Year-Old, where I explained GPT in a way even a child could understand.
So next time you enjoy perfectly balanced chai or soup, just remember — somewhere in that kitchen magic, there’s a human version of vector embeddings at work.