Deep Integration: Building Multi-Agent Workflows with CrewAI & LangDB AI Gateway

Mrunmay Shelar

In the world of AI agent development, the challenge isn't just building individual agents—it's creating systems where agents can seamlessly collaborate, access the right tools, and leverage the best models for each task. What if you could orchestrate complex multi-agent workflows with built-in model access, dynamic tool management, and complete observability?

In this post, we'll show you how CrewAI and LangDB work together to create a deeply integrated agent development platform.

TL;DR:

This guide demonstrates the deep integration between CrewAI and the LangDB AI Gateway, showing how to build sophisticated multi-agent workflows with seamless model access, dynamic tool management, and built-in observability. We'll use a report generation system as a practical example of how these technologies work together as a unified platform with access to 350+ LLMs.

Final LangDB Thread view for the CrewAI workflow

Full Conversation: https://app.langdb.ai/sharing/threads/3becbfed-a1be-ae84-ea3c-4942867a3e22

This team of agents collaborates to deliver comprehensive reports by combining web research for current information with analysis and professional writing. You can see a full trace of the entire workflow execution.

The Code

You can find the complete source code for this project on GitHub.

The Integration: CrewAI + LangDB AI Gateway as a Unified Platform

Our system demonstrates how CrewAI and the LangDB AI Gateway work together as a deeply integrated platform:

  • CrewAI: Provides the orchestration framework for multi-agent workflows

  • LangDB AI Gateway: Delivers the gateway capabilities, including access to 350+ models, model management, tool management, and observability

  • Unified Experience: Both technologies work seamlessly together, appearing as a single development platform

This integration enables you to build sophisticated agent systems without worrying about the underlying infrastructure complexity.

Check out the full guide and the tracing documentation: https://docs.langdb.ai/guides/building-agents/building-reporting-writing-agent-using-crewai and https://docs.langdb.ai/features/tracing

Seamless Integration Setup

The integration between CrewAI and the LangDB AI Gateway is designed to be as simple as possible. With just a few lines of code, you get access to over 350 models, dynamic tooling, and complete observability.

Initialize the Integration

The first step is to initialize the LangDB integration with CrewAI. This single call enables all the advanced features:

# main.py
from pylangdb.crewai import init
from dotenv import load_dotenv

# Load environment variables and initialize LangDB integration
load_dotenv()
init()  # This enables deep integration with the LangDB AI Gateway

Configure the LangDB AI Gateway

Set up your LangDB AI Gateway credentials to enable model access and tool management:

export LANGDB_API_KEY="<your_langdb_api_key>"
export LANGDB_PROJECT_ID="<your_langdb_project_id>"
export LANGDB_API_BASE_URL='https://api.us-east-1.langdb.ai'
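Since main.py already calls load_dotenv(), you can equally keep these values in a .env file at the project root (placeholders shown):

# .env
LANGDB_API_KEY="<your_langdb_api_key>"
LANGDB_PROJECT_ID="<your_langdb_project_id>"
LANGDB_API_BASE_URL="https://api.us-east-1.langdb.ai"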

Create LangDB AI Gateway-Enabled LLMs

Define a helper function that creates LLMs with full LangDB AI Gateway integration:

from crewai import LLM
import os

def create_llm(model):
    """Create a CrewAI LLM that routes requests through the LangDB AI Gateway."""
    return LLM(
        model=model,
        api_key=os.environ.get("LANGDB_API_KEY"),
        base_url=os.environ.get("LANGDB_API_BASE_URL"),
        extra_headers={
            # Associates every request with your LangDB project for tracing and cost tracking
            "x-project-id": os.environ.get("LANGDB_PROJECT_ID")
        }
    )
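As a quick, hypothetical smoke test (not part of the report-generation project), you can build an LLM with this helper and call it directly; CrewAI's LLM class exposes a call() method that accepts a prompt:

# Hypothetical smoke test for the helper - verifies that requests reach the gateway
llm = create_llm("openai/gpt-4o")
print(llm.call("Reply with 'gateway OK' if you can read this."))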

Deep Integration Features

Model Selection across 350+ Models

The LangDB AI Gateway's model access capabilities are fully integrated with CrewAI. You can specify any of the 350+ models available through the gateway, spanning multiple providers:

# Each agent can use different models seamlessly
@agent
def researcher(self) -> Agent:
    return Agent(
        config=self.agents_config['researcher'],
        llm=create_llm("openai/langdb/reportresearcher_9wzgx5n5") # Virtual Model with tools
    )

@agent
def analyst(self) -> Agent:
    return Agent(
        config=self.agents_config['analyst'],
        llm=create_llm("openai/anthropic/claude-3.7-sonnet") # Direct model access
    )

@agent
def report_writer(self) -> Agent:
    return Agent(
        config=self.agents_config['report_writer'],
        llm=create_llm("openai/gpt-4o") # Another model provider
    )

Dynamic Tool Management

The LangDB AI Gateway's Virtual Models and Virtual MCPs integrate seamlessly with CrewAI agents. Tools are managed centrally in the LangDB AI Gateway but appear natively to your CrewAI agents:

# The researcher agent automatically gets access to web search tools
# through the LangDB Virtual Model, without any additional configuration
@agent
def researcher(self) -> Agent:
    return Agent(
        config=self.agents_config['researcher'],
        llm=create_llm("openai/langdb/reportresearcher_9wzgx5n5") # Tools included automatically
    )

Built-in Observability

Every interaction is automatically traced and observable through the LangDB AI Gateway's integrated tracing system:

# No additional tracing code needed - it's all automatic
def generate_report(topic):
    crew_instance = ReportGenerationCrew()
    result = crew_instance.crew().kickoff(inputs={"topic": topic})
    return result  # Full trace automatically captured in the LangDB AI Gateway

LangDB trace view showing task durations, an execution timeline, and run metadata (trace and thread IDs, timestamps)

Advanced Integration Capabilities

Virtual Model Integration

LangDB AI Gateway Virtual Models work seamlessly with CrewAI agents. You can create models with specific capabilities and use them directly:

# This agent automatically gets web search capabilities
# through the LangDB Virtual Model configuration
@agent
def researcher(self) -> Agent:
    return Agent(
        config=self.agents_config['researcher'],
        llm=create_llm("openai/langdb/reportresearcher_9wzgx5n5")
    )

The Virtual Model is configured in the LangDB AI Gateway UI to include:

  • Base model (e.g., GPT-4.1)

  • Attached MCP tools (e.g., Tavily Search)

  • Custom instructions and parameters

LangDB UI showing the Virtual Model configuration with an attached MCP Server
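The Virtual Model is just another model ID to clients, so you could also sanity-check it outside CrewAI. The sketch below assumes the gateway exposes an OpenAI-compatible chat completions endpoint at the same base URL and that the Virtual Model is addressed without the openai/ routing prefix; check the LangDB docs for the exact details:

# Sketch: calling the Virtual Model directly through the gateway (assumptions noted above)
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["LANGDB_API_KEY"],
    base_url=os.environ["LANGDB_API_BASE_URL"],  # exact path may differ; see LangDB docs
    default_headers={"x-project-id": os.environ["LANGDB_PROJECT_ID"]},
)

response = client.chat.completions.create(
    model="langdb/reportresearcher_9wzgx5n5",  # assumed Virtual Model ID without the CrewAI/LiteLLM prefix
    messages=[{"role": "user", "content": "Find two recent developments in AI-powered social media marketing."}],
)
print(response.choices[0].message.content)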

MCP Tool Integration

The LangDB AI Gateway's MCP (Model Context Protocol) tools integrate with Virtual Models:

Setting Up Virtual MCP Server With Virtual Model

This means:

  • No Tool Configuration in Code: Tools are managed in the LangDB AI Gateway UI

  • Dynamic Updates: Change tools without redeploying agents

  • Automatic Tracing: All tool calls are traced in the LangDB AI Gateway

  • Cost Tracking: Tool usage costs are tracked automatically

Multi-Provider Model Access

The LangDB AI Gateway's model access capabilities are fully integrated with CrewAI, giving you access to 350+ models across providers:

# Each agent can use different models seamlessly
@agent
def researcher(self) -> Agent:
    return Agent(
        config=self.agents_config['researcher'],
        llm=create_llm("openai/langdb/reportresearcher_9wzgx5n5") # OpenAI + tools
    )

@agent
def analyst(self) -> Agent:
    return Agent(
        config=self.agents_config['analyst'],
        llm=create_llm("openai/anthropic/claude-3.7-sonnet") # Anthropic
    )

@agent
def report_writer(self) -> Agent:
    return Agent(
        config=self.agents_config['report_writer'],
        llm=create_llm("openai/gpt-4o") # OpenAI
    )

Integrated Workflow Management

Crew Definition with LangDB AI Gateway Integration

The crew definition remains simple while leveraging all LangDB AI Gateway capabilities:

@crew
def crew(self) -> Crew:
    return Crew(
        agents=[self.researcher(), self.analyst(), self.report_writer()],
        tasks=[self.research_task(), self.analysis_task(), self.report_writing_task()],
        process=Process.sequential
    )
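For context, the @agent and @crew methods shown above live inside a CrewAI @CrewBase class. Here is a minimal sketch of how that class and the referenced tasks might be laid out; the YAML config paths and the use of the create_llm helper are assumptions based on the snippets earlier in this post:

# crew.py - a sketch of the class that holds the agents, tasks, and crew shown above
from crewai import Agent, Crew, Process, Task
from crewai.project import CrewBase, agent, crew, task

@CrewBase
class ReportGenerationCrew():
    # Paths are assumptions; CrewAI loads these YAML files into agents_config / tasks_config
    agents_config = 'config/agents.yaml'
    tasks_config = 'config/tasks.yaml'

    @agent
    def researcher(self) -> Agent:
        return Agent(
            config=self.agents_config['researcher'],
            llm=create_llm("openai/langdb/reportresearcher_9wzgx5n5")  # helper defined earlier
        )

    # ... analyst and report_writer agents as shown in the previous sections ...

    @task
    def research_task(self) -> Task:
        return Task(config=self.tasks_config['research_task'])

    @task
    def analysis_task(self) -> Task:
        return Task(config=self.tasks_config['analysis_task'])

    @task
    def report_writing_task(self) -> Task:
        return Task(config=self.tasks_config['report_writing_task'])

    @crew
    def crew(self) -> Crew:
        return Crew(
            agents=[self.researcher(), self.analyst(), self.report_writer()],
            tasks=[self.research_task(), self.analysis_task(), self.report_writing_task()],
            process=Process.sequential  # Research, then analysis, then writing
        )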

Task Execution with Built-in Observability

Tasks execute with full LangDB AI Gateway integration:

def generate_report(topic):
    crew_instance = ReportGenerationCrew()
    result = crew_instance.crew().kickoff(inputs={"topic": topic})
    return result  # Full trace automatically available in the LangDB AI Gateway

Integration Benefits

Simplified Development

  • Single Setup: One initialization call enables all features

  • No In-Code Tool Management: Tools are managed centrally in the LangDB AI Gateway

  • Automatic Tracing: No additional observability code needed

Dynamic Capabilities

  • Model Switching: Choose from 350+ models without code changes

  • Tool Updates: Add/remove tools through the LangDB AI Gateway UI

  • Model Flexibility: Mix and match models from different providers seamlessly

Production Ready

  • Built-in Observability: Complete traces for every execution

  • Cost Tracking: Automatic cost and usage monitoring

  • Performance Monitoring: Latency and performance metrics

  • Error Handling: Integrated error tracking and debugging

Running the Integrated System

Execute the workflow with full LangDB integration:

if __name__ == "__main__":
    generate_report("The Impact of AI on Social Media Marketing in 2024")

The system automatically:

  • Provides access to the appropriate model for each agent from 350+ options

  • Manages tool access

  • Captures complete traces

  • Tracks costs and performance
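Putting the pieces together, a minimal main.py might look like the sketch below; the crew module name is an assumption, and ReportGenerationCrew and create_llm are the class and helper discussed above:

# main.py - a minimal end-to-end sketch combining the snippets from this post
from dotenv import load_dotenv
from pylangdb.crewai import init

load_dotenv()  # Load LANGDB_API_KEY, LANGDB_PROJECT_ID, LANGDB_API_BASE_URL
init()         # Enable the LangDB AI Gateway integration for CrewAI

from crew import ReportGenerationCrew  # assumed module containing the @CrewBase class

def generate_report(topic):
    crew_instance = ReportGenerationCrew()
    # The topic is interpolated into the agent/task config templates; the full
    # trace is captured automatically by the LangDB AI Gateway
    return crew_instance.crew().kickoff(inputs={"topic": topic})

if __name__ == "__main__":
    result = generate_report("The Impact of AI on Social Media Marketing in 2024")
    print(result)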

Real-World Integration Example

Here's what the integration looks like in practice:

# The Impact of AI on Social Media Marketing in 2024

## Executive Summary

Artificial Intelligence has fundamentally transformed social media marketing in 2024, creating new opportunities and challenges for businesses worldwide. This report examines the current state of AI integration in social media marketing, key trends, and strategic implications for marketers.

## Key Findings

### 1. AI-Powered Content Creation
- **Automated Content Generation**: 73% of marketers now use AI tools for content creation
- **Personalization at Scale**: AI enables hyper-personalized content delivery to specific audience segments
- **Real-time Optimization**: Dynamic content adjustment based on performance metrics

### 2. Advanced Analytics and Insights
- **Predictive Analytics**: AI models forecast campaign performance with 85% accuracy
- **Sentiment Analysis**: Real-time brand sentiment monitoring across platforms
- **Competitive Intelligence**: Automated tracking of competitor strategies and performance

Every step of this report generation was powered by the deep integration between CrewAI and LangDB, with complete observability into the process.

Conclusion

The integration between CrewAI and the LangDB AI Gateway creates a unified platform for building sophisticated multi-agent systems. This deep integration provides:

  • Seamless Development: Build complex workflows with simple, clean code

  • Dynamic Capabilities: Change models and tools without redeployment

  • Built-in Observability: Complete visibility into every aspect of your workflows

  • Production Ready: Enterprise-grade monitoring and management

This architecture enables rapid development and iteration, allowing you to build truly powerful and intelligent agentic systems for any domain.

Ready to build your own? Start building for free on the LangDB AI Gateway or explore CrewAI.
