📈 What are Vector Embeddings?

Bhavya Jain
1 min read

In this blog, we will discuss how vector embeddings capture semantic meaning and the relationships between words.

Vector embeddings represent words (or tokens) as multi-dimensional numerical vectors. Machine learning algorithms then use these vectors for tasks such as similarity search, clustering, and classification by measuring how close the vectors lie to one another in a continuous space.
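To make "proximity in a continuous space" concrete, here is a minimal sketch using cosine similarity. The three vectors below are made-up toy values, not output from any real embedding model; real embeddings typically have hundreds of dimensions.

```python
import numpy as np

# Hypothetical toy embeddings (hand-picked values for illustration only).
cat = np.array([0.9, 0.8, 0.1])
dog = np.array([0.85, 0.75, 0.2])
car = np.array([0.1, 0.2, 0.95])

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: close to 1.0 means
    the vectors point in nearly the same direction (similar meaning)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(cat, dog))  # high: related meanings
print(cosine_similarity(cat, car))  # much lower: unrelated meanings
```

A similarity search simply ranks all stored vectors by this score against a query vector and returns the top matches.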

In a simplified visualization, these vectors can be plotted as points in a 2-dimensional plane, where each point is a word or token and nearby points belong to the same category group.

But how do we explain this? Let’s assume we want to explain this to our mom 🤶, so we will plot the points like this:

  • Top-right: Chakla, Belan, Broom, Pocha → daily tools.

  • Top-left: Kadhai, Chammach, Fridge, Microwave, Mixer, Grinder → kitchen essentials.

  • Bottom-right: Saree, Suit, Bucket, Mug → clothes and washing-related.

  • Bottom-left: Sofa, Bed → furniture.

This example is drawn in 2D, but real embeddings live in much higher-dimensional spaces (hundreds or even thousands of dimensions) learned from huge amounts of data, and it is the relative positions of all these vectors together that lets a model produce meaningful output.
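The quadrant picture above can be sketched in code. The 2-D coordinates below are invented purely to mirror the sketch (a real model would learn them from data), and a nearest-neighbour lookup shows that items from the same corner end up as each other's closest points:

```python
import numpy as np

# Hypothetical hand-made 2-D "embeddings" mirroring the four quadrants above.
points = {
    "Chakla": np.array([3.0, 3.0]),    # top-right
    "Belan":  np.array([3.2, 2.8]),    # top-right
    "Kadhai": np.array([-3.0, 3.0]),   # top-left
    "Fridge": np.array([-2.8, 3.2]),   # top-left
    "Saree":  np.array([3.0, -3.0]),   # bottom-right
    "Sofa":   np.array([-3.0, -3.0]),  # bottom-left
    "Bed":    np.array([-2.7, -3.1]),  # bottom-left
}

def nearest(word: str) -> str:
    """Return the closest other item by straight-line (Euclidean) distance."""
    q = points[word]
    return min((w for w in points if w != word),
               key=lambda w: np.linalg.norm(points[w] - q))

print(nearest("Sofa"))    # Bed: the other furniture item sits closest
print(nearest("Chakla"))  # Belan: same kitchen-tool corner
```

Scaling this same idea up from 2 dimensions to hundreds, and from 7 items to millions, is essentially what a vector database does.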

😊 Stay tuned for more talks about AI…

