How to Build an Epic Conversational AI with LangDB.ai and Rasa-pro (and Get $10 Free Credit!)


In the ever-evolving landscape of AI innovation, building a next-level chatbot is about more than just coding: it’s about creating an ecosystem of AI integrations that supports enterprise AI governance, scaling, and observability. Today, we’re diving deep into how you can merge the formidable powers of LangDB.ai and Rasa Pro to create a conversational AI system that’s both smart and scalable—all while enjoying a $10 free credit to kickstart your journey!
Alternatively, you can follow along with our YouTube tutorial.
What’s the Buzz About LangDB.ai?
LangDB.ai is a powerful AI platform designed to help enterprises securely govern, optimize, and scale their AI solutions. As an AI gateway, it provides a seamless connection with over 150 large language models (LLMs) using OpenAI-compatible APIs.
By offering AI governance, cost efficiency, and enterprise AI solutions, LangDB ensures businesses can deploy AI models with enhanced security, performance, and reliability.
With LangDB, organizations benefit from AI as a service, enabling smooth AI for business intelligence operations while optimizing resources through intelligent model routing and observability. Whether you're building AI software solutions or integrating AI for enterprises, LangDB.ai is the best artificial intelligence platform to streamline AI deployments at scale.
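Because LangDB exposes OpenAI-compatible APIs, any OpenAI-style client can talk to it simply by swapping the base URL. As a rough sketch of what such a request looks like (the project ID, model name, and URL pattern here are placeholders mirroring the ones used later in this guide, not values from your account):

```python
import json
import os
import urllib.request

# LangDB's OpenAI-compatible gateway URL, parameterized by project ID
# (pattern taken from the endpoints.yml example later in this guide).
LANGDB_BASE = "https://api.us-east-1.langdb.ai/{project_id}/v1"

def build_chat_request(project_id: str, model: str, user_message: str):
    """Build the URL, headers, and JSON body for an OpenAI-style
    chat-completions request routed through LangDB."""
    url = LANGDB_BASE.format(project_id=project_id) + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return url, headers, body

if __name__ == "__main__":
    url, headers, body = build_chat_request("your-project-id", "gpt-4o", "Hi!")
    print(url)
    # Uncomment to actually send the request (needs a valid key and project):
    # req = urllib.request.Request(url, json.dumps(body).encode(), headers=headers)
    # print(urllib.request.urlopen(req).read())
```

The point is that nothing LangDB-specific leaks into your application code: the same payload you would send to OpenAI is simply routed through your LangDB project URL.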
What is Rasa?
Rasa is a leading open-source framework for building conversational AI solutions. It is celebrated for empowering developers with AI governance tools and AI management capabilities that streamline the process of creating engaging, intelligent chatbots. By incorporating Rasa into your workflow, you gain access to advanced policies and pipelines—ideal for enterprise AI governance and AI integration—that help manage and scale AI applications effectively.
What’s on Our Agenda?
Here’s a sneak peek into the roadmap of our ultimate conversational AI guide:
Installation and Setup: Learn how to create a Conda environment and install all the necessary packages.
Building Your Chatbot: Discover how to add engaging flows, configure responses, and fine-tune your Rasa project.
Integrating LangDB.ai: Set up LangDB.ai to bring powerful AI infrastructure tools into your project.
Training and Deployment: Train your model and launch your AI server for a live demo.
Step-by-Step Guide to Get Started
Installation
Kick off by setting up a dedicated Conda environment to keep your project clean and dependencies in check:
Download and Install Miniconda.
Once installed, follow the steps below to install Rasa Pro in your environment:
- Create your Conda environment:

```bash
conda create -n rasa-env python=3.10
```
- Activate your Conda environment:

```bash
conda activate rasa-env
```
- Turbocharge your setup: speed up the installation process by installing `uv`:

```bash
pip install uv
```
- Install Rasa Pro:

```bash
uv pip install rasa-pro --extra-index-url=https://europe-west3-python.pkg.dev/rasa-releases/rasa-pro-python/simple/
```
Setting Up Your Rasa Project
If you’re new to Rasa or need a Rasa Pro license key, no sweat—grab yours from Rasa’s developer portal and set it in your environment:
You may receive an email from Rasa with a license key, similar to the screenshot below.
Open your terminal and activate your Conda environment, `rasa-env`.
In the same terminal, set your Rasa Pro license as follows (use `set` on Windows, `export` on Mac/Linux):

```bash
set RASA_PRO_LICENSE=your-key
```
Then, initialize your Rasa CALM project:
```bash
rasa init --template calm
```
This command scaffolds your project structure, priming it for some serious AI integration and AI application governance.
Building Your Rasa Chatbot: Nerd Out with Cool Flows and Responses
Adding flows
- Let’s create a flow that greets users with some serious nerd cred. In the `data/flows` folder, create a file called `greet.yml`:
```yaml
flows:
  greet:
    description: always run when the user first greets the bot
    name: greet
    steps:
      - action: utter_greet
```
- This simple yet effective flow ensures your bot springs into action the moment someone drops a “Hi.”
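YAML indentation mistakes are the most common reason a flow fails to load at training time. A quick sanity check is to parse the file and inspect the structure—here with PyYAML, which Rasa already pulls in as a dependency (the inlined string below just mirrors the `greet.yml` above):

```python
import yaml  # PyYAML, already installed as a Rasa dependency

# Same content as data/flows/greet.yml above, inlined for the check.
FLOW_YML = """
flows:
  greet:
    description: always run when the user first greets the bot
    name: greet
    steps:
      - action: utter_greet
"""

doc = yaml.safe_load(FLOW_YML)
greet = doc["flows"]["greet"]

# If indentation drifted, these lookups would fail with a KeyError.
assert greet["name"] == "greet"
assert greet["steps"] == [{"action": "utter_greet"}]
print("flow looks well-formed:", greet["name"])
```

In a real project you would `yaml.safe_load(open("data/flows/greet.yml"))` instead of inlining the string.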
Crafting Witty Responses
- Next, update the `shared.yml` file in the `domain` folder to add your greeting response:
```yaml
version: "3.1"
slots:
  return_value:
    type: any
    mappings:
      - type: custom
        action: add_contact
      - type: custom
        action: remove_contact
responses:
  utter_greet:
    - text: "Hello, how may I help you?"
```
Configuring Rasa for LangDB.ai Integration
- Integrate your Rasa project with LangDB.ai by updating your `config.yml`:
```yaml
recipe: default.v1
language: en
pipeline:
  - name: SingleStepLLMCommandGenerator
    llm:
      model_group: openai-gpt-4
policies:
  - name: FlowPolicy
  - name: IntentlessPolicy
assistant_id: 20250207
```
The next step is to add this model group to `endpoints.yml` for the LangDB.ai integration. Append the following configuration at the very end of your `endpoints.yml`:
```yaml
model_groups:
  - id: openai-gpt-4
    models:
      - provider: openai
        model: gpt-4o
        api-base: "https://api.us-east-1.langdb.ai/your-project-id/v1"
        request_timeout: 7
        max_tokens: 256
```
- `id`: your custom name for the model group (make sure it matches the one in your `config.yml`)
- `model`: your LangDB.ai model name
Remember to replace `your-project-id` with your actual project ID to enjoy seamless AI integration and smart AI functionalities.
You can find your project ID in your LangDB.ai project dashboard.
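A common failure mode is a mismatch between the `model_group` name referenced in `config.yml` and the `id` defined in `endpoints.yml`. A small cross-check makes the wiring explicit (the strings below are trimmed copies of the two configs above, parsed with PyYAML):

```python
import yaml  # PyYAML, already installed as a Rasa dependency

# Trimmed copy of config.yml: the pipeline references a model group by name.
CONFIG_YML = """
pipeline:
  - name: SingleStepLLMCommandGenerator
    llm:
      model_group: openai-gpt-4
"""

# Trimmed copy of endpoints.yml: the model group is defined here.
ENDPOINTS_YML = """
model_groups:
  - id: openai-gpt-4
    models:
      - provider: openai
        model: gpt-4o
"""

config = yaml.safe_load(CONFIG_YML)
endpoints = yaml.safe_load(ENDPOINTS_YML)

referenced = config["pipeline"][0]["llm"]["model_group"]
defined = {group["id"] for group in endpoints["model_groups"]}

# Training will fail if the referenced group is never defined.
assert referenced in defined, f"model group {referenced!r} missing from endpoints.yml"
print("model group wiring OK:", referenced)
```

Running the same check against your real files (loading them from disk instead of inline strings) catches the typo before `rasa train` does.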
Training Your Rasa Model with LangDB.ai
Before you train, set these environment variables to direct all your API calls through LangDB.ai, ensuring robust AI operational visibility:
Use `set` (or `$env:` in PowerShell) on Windows and `export` on Mac/Linux:

```bash
OPENAI_API_KEY=your-api-key
OPENAI_BASE_URL=https://api.us-east-1.langdb.ai/your-project-id/v1
RASA_PRO_LICENSE=your-rasa-pro-license-key
```
Make sure to update `your-project-id` with your own.
Why do we set the OPENAI_BASE_URL environment variable?
By default, Rasa sends all queries to OpenAI’s API URL.
Since we are using LangDB, we override OPENAI_BASE_URL with LangDB’s API URL instead.
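The effect of this override can be illustrated with the lookup logic an OpenAI-compatible client typically applies: fall back to OpenAI's standard endpoint unless `OPENAI_BASE_URL` is set. (This is a simplified sketch of the idea; the exact resolution inside Rasa's LLM client may differ.)

```python
import os

# Standard OpenAI endpoint used when no override is present.
OPENAI_DEFAULT = "https://api.openai.com/v1"

def resolve_base_url(env: dict) -> str:
    """Return the API base URL a client would use for a given environment."""
    return env.get("OPENAI_BASE_URL", OPENAI_DEFAULT)

# Without the override, calls go to OpenAI:
assert resolve_base_url({}) == OPENAI_DEFAULT

# With the override, the same client transparently talks to LangDB:
langdb = "https://api.us-east-1.langdb.ai/your-project-id/v1"
assert resolve_base_url({"OPENAI_BASE_URL": langdb}) == langdb

print("requests will be routed to:", resolve_base_url(dict(os.environ)))
```

Because the switch happens entirely in the environment, none of your Rasa project files need to know which gateway is in use.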
Everything is set; let’s train our model by running:

```bash
rasa train
```
Running our Conversational AI
- Time to see your creation in action! Open two terminal windows:
Ensure both terminals have the necessary environment variables set:

```bash
OPENAI_API_KEY=your-api-key
OPENAI_BASE_URL=https://api.us-east-1.langdb.ai/your-project-id/v1
RASA_PRO_LICENSE=your-rasa-pro-license-key
```
Terminal 1: start the actions server by running:

```bash
rasa run actions
```
Terminal 2: open Rasa’s interactive chat UI by running:

```bash
rasa inspect --debug
```
Watch as your Chat UI lights up with AI integration that delivers unparalleled AI and analytics performance.
Final Thoughts: Nerd Out and Optimize Your AI!
Integrating LangDB.ai with Rasa Pro is more than just building a chatbot—it’s about constructing a dynamic, analytical AI ecosystem that brings AI observability, AI governance tools, and end-to-end AI integrations to life. Whether you’re exploring AI for managers, tinkering with AI in management, or simply passionate about AI integrations, this guide sets you up for success.
Bonus alert: get started for free with a $10 credit to experiment with these cutting-edge tools without any upfront cost! Embrace your inner tech nerd, explore the endless possibilities of AI management and infrastructure tools, and join the revolution in AI observability.
Don’t forget to check out our source code and starter pack, and if you love what you see, hit that star button on GitHub. Happy coding, and may your AI always be as smart, scalable, and observable as possible!
Written by Dishant Gandhi