🚀 My Journey into Agentic AI: Day 1 — Laying the Foundation with NLP

Avadhoot Kamble
3 min read

It’s official — I’ve started my journey into the world of Agentic AI.

Agentic AI is one of the most exciting, fast-evolving fields in AI today, but it can also feel overwhelming. To avoid getting lost in the hype, I decided to start with the basics — the building blocks that make Agentic AI possible.

Through my research, I realized that Generative AI is the foundation, and at the heart of Generative AI lies Natural Language Processing (NLP).

So, Day 1 of my journey was all about understanding NLP.


🔍 What I Explored on Day 1

1. What is NLP and Why Does It Matter?

I began by exploring what NLP (Natural Language Processing) is and where it is used. From auto-complete on our phones to chatbots, recommendation engines, and sentiment analysis — NLP is everywhere. Understanding this convinced me that this is the right foundation for Agentic AI.


2. Tokenization & Preprocessing

Next, I looked at how machines break down human language. Tokenization splits sentences into words, and techniques like stop word removal, stemming, and lemmatization help simplify the text while retaining meaning.

It felt like learning the grammar rules of a new language, but this time for computers.
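To make this concrete, here is a tiny pure-Python sketch of the pipeline, assuming a made-up stop-word list and a deliberately crude suffix-stripping stemmer (real projects would use a library like NLTK or spaCy instead):

```python
import re

STOP_WORDS = {"the", "is", "a", "an", "and", "of", "to"}  # tiny illustrative list

def tokenize(text):
    """Lowercase the text and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def remove_stop_words(tokens):
    """Drop very common words that carry little meaning on their own."""
    return [t for t in tokens if t not in STOP_WORDS]

def stem(token):
    """Crude rule-based stemming: strip a few common suffixes."""
    for suffix in ("ing", "ed", "s"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

sentence = "The cats are chasing the mice and playing"
tokens = tokenize(sentence)
filtered = remove_stop_words(tokens)
stemmed = [stem(t) for t in filtered]
print(stemmed)  # ['cat', 'are', 'chas', 'mice', 'play']
```

Note how "chasing" becomes the non-word "chas": that is exactly the rough edge of stemming that lemmatization (mapping to dictionary forms) is meant to fix.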


3. Numericalizing Words

Words alone can’t be fed into a machine, so I studied how they are converted into numbers.

  • One-hot encoding

  • CountVectorizer

  • TF-IDF vectorization

Among these, I found TF-IDF (Term Frequency–Inverse Document Frequency) the most reliable, because it down-weights words that appear in almost every document and highlights the words that actually distinguish one document from another.
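The TF-IDF idea fits in a few lines of pure Python. This is a minimal sketch using the common definitions tf = count / document length and idf = log(N / document frequency); libraries like scikit-learn use smoothed variants of the same formula:

```python
import math
from collections import Counter

def tf_idf(docs):
    """Compute TF-IDF weights for a list of tokenized documents."""
    n_docs = len(docs)
    df = Counter()                      # in how many documents each term appears
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        counts = Counter(doc)
        weights.append({
            term: (count / len(doc)) * math.log(n_docs / df[term])
            for term, count in counts.items()
        })
    return weights

docs = [
    ["the", "cat", "sat"],
    ["the", "dog", "ran"],
    ["the", "cat", "ran"],
]
weights = tf_idf(docs)
# "the" appears in every document, so its IDF is log(3/3) = 0
print(weights[0]["the"])  # 0.0
print(weights[0]["cat"])  # positive: "cat" appears in only 2 of 3 documents
```

This is exactly the behaviour that makes TF-IDF useful: the ubiquitous "the" scores zero, while rarer, more informative words get positive weight.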


4. Word Embeddings

This was the most fascinating part of the day. Word embeddings transform words into vectors that capture meaning and relationships.

I explored Word2Vec, which itself has two methods:

  • CBOW (Continuous Bag-of-Words) — predicts a word based on its context

  • Skip-gram — predicts the context from a word

I also discovered Negative Sampling, a trick that makes training efficient: instead of updating scores for the entire vocabulary at every step, the model only updates the true context word plus a handful of randomly sampled "negative" words.
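The difference between CBOW and skip-gram is easiest to see in the training pairs each one generates from the same sentence. Here is a small sketch (the sentence and window size are just illustrative choices):

```python
def training_pairs(tokens, window=1):
    """Build training pairs for the two Word2Vec variants.

    CBOW:      (context words -> target word)
    Skip-gram: (target word -> one context word), one pair per neighbour
    """
    cbow, skipgram = [], []
    for i, target in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        context = [tokens[j] for j in range(lo, hi) if j != i]
        cbow.append((context, target))
        for c in context:
            skipgram.append((target, c))
    return cbow, skipgram

tokens = ["agents", "use", "language", "models"]
cbow, skipgram = training_pairs(tokens)
print(cbow[1])        # (['agents', 'language'], 'use')
print(skipgram[:2])   # [('agents', 'use'), ('use', 'agents')]
```

Same sentence, opposite direction of prediction: CBOW averages the context to guess the word, skip-gram uses the word to guess each neighbour.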


5. Properties of Word Embeddings

Word embeddings aren’t just numbers; they capture relationships like:

  • Semantic similarity (king – man + woman = queen 👑)

  • Compositionality (phrases built from word meanings)

  • Compactness (smaller vector space yet meaningful)

  • Context adaptation (meanings shift with context)

It amazed me how mathematics and language come together in this way.
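The famous king − man + woman ≈ queen analogy can be demonstrated with cosine similarity. The 3-dimensional vectors below are made up purely for illustration (real embeddings have hundreds of dimensions learned from data), but the arithmetic is the real thing:

```python
import math

# Toy embeddings with hypothetical values, chosen only to illustrate the idea.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.1, 0.9, 0.1],
    "woman": [0.1, 0.1, 0.9],
    "apple": [0.5, 0.0, 0.0],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# king - man + woman should land closest to queen
target = [k - m + w for k, m, w in
          zip(embeddings["king"], embeddings["man"], embeddings["woman"])]
best = max((w for w in embeddings if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(target, embeddings[w]))
print(best)  # queen
```

Excluding the query words themselves from the search is a standard detail: the nearest neighbour of king − man + woman is often king itself.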


6. The Embedding Matrix

Finally, I explored how embeddings are stored in an embedding matrix and how they can be visualized. Seeing clusters of related words together gave me a clear sense of why embeddings are such a powerful tool in AI.
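Stripped of framework machinery, an embedding matrix is just a table of learned vectors, and an embedding layer is a row lookup. A minimal sketch (the vocabulary and values here are invented for illustration):

```python
# Vocabulary maps each word to a row index in the embedding matrix.
vocab = {"agent": 0, "model": 1, "language": 2}

# One row per word, one column per embedding dimension.
# These numbers are made up; in practice they are learned during training.
embedding_matrix = [
    [0.2, 0.7, 0.1],   # agent
    [0.6, 0.3, 0.5],   # model
    [0.4, 0.4, 0.9],   # language
]

def embed(word):
    """Turn a word into its vector: just index into the matrix."""
    return embedding_matrix[vocab[word]]

print(embed("model"))  # [0.6, 0.3, 0.5]
```

This is why visualizing embeddings works so nicely: project those rows down to 2D (e.g. with PCA or t-SNE) and related words cluster together.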


🌱 Wrapping Up Day 1

Day 1 was both challenging and exciting. I started with just a curiosity about Agentic AI, but by the end of the day, I realized how much depth there is even in the “basics.”

This foundation in NLP and word embeddings will serve as the base for everything I build next in my Agentic AI journey.


🙌 Over to You

This is just the beginning. On Day 2, I’ll continue exploring deeper concepts. But I’d love to hear from you:

👉 What do you think I should learn next to strengthen my path toward Agentic AI?
Should I go deeper into embeddings, jump into Transformers, or start experimenting with hands-on Generative AI models?

I’m open to your suggestions! 🚀
