If you are developing an AI agent or even a simple RAG app, you have almost certainly hit the issue where the LLM spits out random made-up answers that are neither true nor relevant to the context you provided. This is the infamous "hallucination" trait of LLMs.