Vector Embeddings: Even your mom can understand this

Aman Vijay
3 min read

Ever since AI tools like GPT, ChatGPT, and others have been making headlines, you’ve probably heard tech people throw around mysterious terms like “vector embeddings.”
Now, if you’re like my mom, your first reaction is probably:

“Beta, why do you people make simple things sound so complicated?”

Fair enough. So today, let’s throw out the jargon and talk about vector embeddings like we’re having tea in the kitchen.


First things first — what on earth is a vector embedding?

Think of it this way:
Imagine you have a big spice rack in the kitchen with hundreds of jars. Each spice has a specific taste — salty, sweet, spicy, sour, bitter.

Now, suppose I ask you:

“Find me something that tastes kind of like cinnamon.”

You won’t check every jar randomly. Instead, in your mind, you know cinnamon is sweet and warm, so you’ll quickly look for other spices with a similar “taste profile.”

That’s basically what vector embeddings do — except instead of taste, they deal with meaning.


The “taste profile” of words

Every word, sentence, or even an entire paragraph can be turned into a list of numbers — like a recipe card — that captures its “flavor” of meaning.
This list of numbers is called a vector.

If two words have similar meanings, their number lists will look similar too, just like cinnamon and nutmeg share a similar taste profile.
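If you like seeing things concretely, here's a tiny sketch of that idea. The numbers are completely made up for illustration (a real embedding model would produce hundreds of them), and "cosine similarity" is just one common way to compare two number lists:

```python
import math

# Pretend each spice's "taste profile" is only three numbers:
# (sweetness, warmth, heat). Real embeddings work the same way,
# just with far more numbers.
flavors = {
    "cinnamon": [0.9, 0.8, 0.1],
    "nutmeg":   [0.8, 0.9, 0.1],
    "chili":    [0.1, 0.3, 0.9],
}

def similarity(a, b):
    """Cosine similarity: close to 1.0 means very similar, lower means less so."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Cinnamon and nutmeg score high; cinnamon and chili score much lower.
print(similarity(flavors["cinnamon"], flavors["nutmeg"]))
print(similarity(flavors["cinnamon"], flavors["chili"]))
```

That's the whole trick: once meanings are numbers, "which words are similar?" becomes simple arithmetic.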


Why AI cares so much about these numbers

Let’s say I ask an AI:

“Find me documents that talk about cooking pasta.”

The AI doesn’t search by matching the exact words “cooking pasta”. Instead, it converts my query into a vector, then looks for other vectors (documents) that have a similar “meaning flavor.”

This is why AI can still find helpful answers even if the words don’t match exactly. For example:

  • I type “How to boil spaghetti”

  • The AI still finds articles about “pasta preparation”, because the meaning is close even though the words are different.


Where you’ve already seen vector embeddings without realizing it

If you’ve used:

  • Google search (finding results that don’t exactly match your query)

  • Netflix recommendations (finding shows “like” the one you watched)

  • Shopping sites suggesting “similar” products

…then congratulations — you’ve already benefited from vector embeddings without ever hearing the term.


But why not just use normal keywords?

Good question, Mom. Keywords are like giving directions with only street names — if the name changes, you’re lost.
Vector embeddings are like giving directions with GPS coordinates — even if the name changes, the location is still accurate.


Wrapping it up

So, vector embeddings are just a fancy way of saying:

“We turn words into numbers so computers can understand meaning, not just exact words.”

That’s it. No rocket science (unless you’re actually building the AI).

And now, Mom, next time someone mentions “vector embeddings,” you can sip your tea and say:

“Ah yes, the spice rack of AI.”
…and watch their jaw drop.


Written by

Aman Vijay

Full Stack Developer