Vector Embeddings - explained to your Mom

Parth Tuteja
2 min read

Hi mom,

What if I told you I could taste the chai you made without having a single sip of it? Seems weird, right?

How does that work?

You’ve got:

  • Milk

  • Sugar

  • Tea leaves

  • Ginger

  • Water

If I want to describe any chai you make, I don’t have to taste it — I can just write down how much of each ingredient you used:

  • Light chai => [100 ml water, 200 ml milk, 1 spoon sugar, 1 spoon tea leaves, 0 ginger]

  • Kadak adrak chai => [150 ml water, 150 ml milk, 1 spoon sugar, 3 spoons tea leaves, 1 spoon ginger]

That list of numbers is the embedding — it’s chai turned into math.

Now, if two chai recipes have very similar numbers, they’ll probably taste alike.

And that’s exactly how AI “tastes” things without tasting — it compares numbers instead of flavors.
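Here's a tiny Python sketch of that idea, using the two recipes above. The ingredient order is [water, milk, sugar, tea leaves, ginger], and the "how different are these numbers" measure used here is plain Euclidean (straight-line) distance:

```python
import math

# Each chai is just an ordered list of numbers (a vector):
# [water ml, milk ml, spoons sugar, spoons tea leaves, spoons ginger]
light_chai = [100, 200, 1, 1, 0]
kadak_chai = [150, 150, 1, 3, 1]

def euclidean_distance(a, b):
    """Straight-line distance between two chai vectors:
    a small number means similar recipes, a big one means very different."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

print(euclidean_distance(light_chai, kadak_chai))
print(euclidean_distance(light_chai, light_chai))  # identical chai => distance 0
```

The computer never tastes anything; it only checks how far apart the two number lists are.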

But how does that work?

Imagine every chai you have ever made placed on a map. The kadak chais form one cluster, close to each other, and the light chais form another cluster of their own.

In AI, each “chai recipe” list — like [100, 200, 1, 1, 0] — is a vector (just a fancy word for an ordered list of numbers).
When we have lots of these, we can:

  1. Measure distances between them to find similarities.

  2. Cluster them so we know which groups are alike.
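Those two steps can be sketched in a few lines of Python. The "light" and "kadak" vectors come from the post; the two new brews (and their numbers) are made-up examples just to show the clustering idea: each new chai joins the cluster of whichever known chai it is closest to.

```python
import math

def distance(a, b):
    """Step 1: measure the distance between two chai vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# The two recipes from the post act as cluster centers.
anchors = {
    "light": [100, 200, 1, 1, 0],
    "kadak": [150, 150, 1, 3, 1],
}

# Hypothetical new brews (numbers invented for illustration).
new_brews = {
    "sunday chai": [110, 190, 1, 1, 0],
    "monday chai": [145, 155, 1, 3, 1],
}

# Step 2: cluster each new brew with its nearest anchor.
for name, vec in new_brews.items():
    nearest = min(anchors, key=lambda a: distance(vec, anchors[a]))
    print(f"{name} belongs to the {nearest} cluster")
```

Real AI systems do the same thing, just with vectors that have hundreds or thousands of numbers instead of five.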

The act of turning your chai into that list of numbers is called embedding — we are embedding the essence of the chai into math.

The AI doesn’t know what milk or ginger taste like — it just knows the numbers. And from those numbers, it can say,

“This chai is 90% similar to that one.”

And that's how vector embeddings work.

