Build a Multi-Agent System with LangGraph: A Complete Tutorial on Agent Orchestration


Introduction
In the last three posts in our Ultimate LangGraph Tutorial Series, we covered the core components of LangGraph for beginners, long-term memory support, and building an AI agent with custom tool support. After implementing these systems for various enterprise clients, we at FutureSmart AI have observed that as systems grow, they can become complex and hard to manage.
To solve this, we break the application down into smaller, independent agents - a pattern we've successfully deployed in production environments for multiple clients. This approach offers modularity, specialization, and control: we can work on individual agents separately, create expert agents, and decide exactly how they communicate. Having built numerous such systems at FutureSmart AI, we've seen firsthand how this architecture significantly improves maintainability and scalability.
In this blog post, we will create a complete multi-agent system from scratch using LangGraph. A single Supervisor Agent will communicate with several specialized agents, each with its own set of tools, mirroring how we at FutureSmart AI structure enterprise-grade AI solutions.
Setting up the Environment
To get started, install the dependencies below:
%%capture --no-stderr
%pip install -U langgraph langchain_community langchain_openai langchain_experimental langchain-chroma pypdf sentence-transformers
Setting up API Keys
Before diving into building your LangGraph AI agents, it's crucial to set up your API keys. These keys allow your agent to interact securely with external tools like Tavily Search and OpenAI's GPT models. Without them, the tools cannot function.
Here we are using the OpenAI model, but you can use any LLM of your choice.
import getpass
import os
def _set_env(var: str):
    if not os.environ.get(var):
        os.environ[var] = getpass.getpass(f"{var}: ")
_set_env("TAVILY_API_KEY")
_set_env("OPENAI_API_KEY")
Creating the LLM Object
Here's how to initialize the LLM using LangChain's ChatOpenAI:
from langchain_openai import ChatOpenAI
llm = ChatOpenAI(model="gpt-4o")
Create Tools
In this blog, we will build three agents: one that does web research with a search engine tool, one that retrieves documents and answers based on them, and one that queries a SQL database.
In our previous LangGraph blog, we already showed how to create a WebSearch Tool, a Retriever Tool, and an NL2SQL Tool. You can follow those steps to create the tools for this tutorial.
Once you have the tools ready, we can proceed to create the AI agents.
Creating the Supervisor Agent
The supervisor agent is responsible for managing the conversation flow between different specialized agents. It decides which agent should handle the current request and when the task is complete. Let's see how to implement this:
from typing import Literal
from typing_extensions import TypedDict
from langgraph.graph import MessagesState, START, END
from langgraph.types import Command
# Define available agents
members = ["web_researcher", "rag", "nl2sql"]
# Add FINISH as an option for task completion
options = members + ["FINISH"]
# Create system prompt for supervisor
system_prompt = (
    "You are a supervisor tasked with managing a conversation between the"
    f" following workers: {members}. Given the following user request,"
    " respond with the worker to act next. Each worker will perform a"
    " task and respond with their results and status. When finished,"
    " respond with FINISH."
)
# Define router type for structured output
class Router(TypedDict):
    """Worker to route to next. If no workers needed, route to FINISH."""
    next: Literal["web_researcher", "rag", "nl2sql", "FINISH"]
# Create supervisor node function
def supervisor_node(state: MessagesState) -> Command[Literal["web_researcher", "rag", "nl2sql", "__end__"]]:
    messages = [
        {"role": "system", "content": system_prompt},
    ] + state["messages"]
    response = llm.with_structured_output(Router).invoke(messages)
    goto = response["next"]
    print(f"Next Worker: {goto}")
    if goto == "FINISH":
        goto = END
    return Command(goto=goto)
The supervisor agent works by:
Taking the current conversation state as input
Using the system prompt to understand its role
Making a decision about which agent should act next
Returning a command that directs the flow to the chosen agent
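To make the routing step concrete, here is an LLM-free sketch of how the structured Router output maps onto a graph destination (a plain string stands in for LangGraph's END sentinel):

```python
END = "__end__"  # stand-in for langgraph.graph.END

def route(router_output: dict) -> str:
    # The supervisor trusts the structured output: it is either a worker
    # name from the members list or the FINISH signal, which maps to END.
    goto = router_output["next"]
    return END if goto == "FINISH" else goto
```

Because with_structured_output constrains the model to the Router schema, the supervisor never has to parse free-form text to make this decision.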
Implementing Individual Agents
Now, let's create our specialized agents using a custom function that creates individual LangGraph graphs for each agent. This approach allows each agent to have its own workflow while maintaining consistency in how they process requests and use tools.
First, let's define our custom create_agent function:
from typing import Annotated, Sequence
from langchain_core.messages import BaseMessage
from langgraph.graph import StateGraph
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition

class AgentState(TypedDict):
    """The state of the agent."""
    messages: Annotated[Sequence[BaseMessage], add_messages]

def create_agent(llm, tools):
    llm_with_tools = llm.bind_tools(tools)

    def chatbot(state: AgentState):
        return {"messages": [llm_with_tools.invoke(state["messages"])]}

    graph_builder = StateGraph(AgentState)
    graph_builder.add_node("agent", chatbot)
    tool_node = ToolNode(tools=tools)
    graph_builder.add_node("tools", tool_node)
    graph_builder.add_conditional_edges(
        "agent",
        tools_condition,
    )
    graph_builder.add_edge("tools", "agent")
    graph_builder.set_entry_point("agent")
    return graph_builder.compile()
This custom function:
Takes an LLM and a list of tools as input
Creates a new StateGraph for the agent
Sets up the necessary nodes for the agent and tool execution
Configures the flow between nodes
Returns a compiled graph that represents the agent's complete workflow
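The conditional edge relies on LangGraph's prebuilt tools_condition. Its core logic can be sketched in plain Python (a simplified stand-in for illustration, not the library's actual implementation):

```python
from types import SimpleNamespace

def tools_condition_sketch(last_message) -> str:
    # Route to the tool node when the model requested a tool call,
    # otherwise end the agent's subgraph and return its answer.
    if getattr(last_message, "tool_calls", None):
        return "tools"
    return "__end__"

# A message with a pending tool call goes to "tools"...
wants_tool = SimpleNamespace(tool_calls=[{"name": "web_search_tool", "args": {}}])
# ...while a plain answer ends the agent's loop.
final_answer = SimpleNamespace(content="done", tool_calls=[])
```

This is why the graph has an edge from "tools" back to "agent": the model keeps calling tools until it produces a message with no tool calls.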
Now, let's create our specialized agents using this function:
Web Research Agent
from langchain_core.messages import HumanMessage

websearch_agent = create_agent(llm, [web_search_tool])

def web_research_node(state: MessagesState) -> Command[Literal["supervisor"]]:
    result = websearch_agent.invoke(state)
    return Command(
        update={
            "messages": [
                HumanMessage(content=result["messages"][-1].content, name="web_researcher")
            ]
        },
        goto="supervisor",
    )
RAG Agent
rag_agent = create_agent(llm, [retriever_tool])

def rag_node(state: MessagesState) -> Command[Literal["supervisor"]]:
    result = rag_agent.invoke(state)
    return Command(
        update={
            "messages": [
                HumanMessage(content=result["messages"][-1].content, name="rag")
            ]
        },
        goto="supervisor",
    )
NL2SQL Agent
nl2sql_agent = create_agent(llm, [nl2sql_tool])

def nl2sql_node(state: MessagesState) -> Command[Literal["supervisor"]]:
    result = nl2sql_agent.invoke(state)
    return Command(
        update={
            "messages": [
                HumanMessage(content=result["messages"][-1].content, name="nl2sql")
            ]
        },
        goto="supervisor",
    )
Putting It All Together
Finally, let's create the main graph that connects all our agents:
builder = StateGraph(MessagesState)
builder.add_edge(START, "supervisor")
builder.add_node("supervisor", supervisor_node)
builder.add_node("web_researcher", web_research_node)
builder.add_node("rag", rag_node)
builder.add_node("nl2sql", nl2sql_node)
graph = builder.compile()
This creates a complete multi-agent system where:
The supervisor receives the initial request
It routes the request to the appropriate specialized agent
The specialized agent processes the request using its tools
Control returns to the supervisor to decide the next step
The process continues until the supervisor determines the task is complete
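The loop above can be seen in miniature with stub agents in plain Python (no LLM calls; the routing decisions are hard-coded for illustration):

```python
def toy_supervisor(messages):
    # Route to each worker once, in order, then finish; the real
    # supervisor makes this decision with an LLM instead.
    seen = {m.get("name") for m in messages}
    for worker in ["rag", "web_researcher"]:
        if worker not in seen:
            return worker
    return "FINISH"

toy_agents = {
    "rag": lambda msgs: {"name": "rag", "content": "Founder: Pradip Nichite"},
    "web_researcher": lambda msgs: {"name": "web_researcher", "content": "Summary of web results about the founder"},
}

messages = [{"name": None, "content": "Find the founder of FutureSmart AI and research him"}]
while (next_worker := toy_supervisor(messages)) != "FINISH":
    # Each worker appends its result, then control returns to the supervisor
    messages.append(toy_agents[next_worker](messages))
```

Every worker result is appended to the shared message list under the worker's name, exactly as the Command(update=...) calls do in the real graph.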
Visualizing the LangGraph
from IPython.display import Image, display
try:
    display(Image(graph.get_graph().draw_mermaid_png()))
except Exception:
    # You can put your exception handling code here
    pass
Testing the System
Let's test our multi-agent system with a couple of examples:
# Example: Complex Query Using Multiple Agents
input_question = "Find the founder of FutureSmart AI and then do a web research on him"
for s in graph.stream(
    {"messages": [("user", input_question)]},
    subgraphs=True,
):
    print(s)
    print("----")
Output:
Next Worker: rag
((), {'supervisor': None})
----
INSIDE RETRIEVER NODE
(('rag:7c5458df-0abd-944a-27f7-b0bad49ccf3d',), {'agent': {'messages': [AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_fK9lMHGrtubenQ697xpd2ZZ2', 'function': {'arguments': '{"question":"Who is the founder of FutureSmart AI?"}', 'name': 'retriever_tool'}, 'type': 'function'}], 'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 25, 'prompt_tokens': 70, 'total_tokens': 95, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}}, 'model_name': 'gpt-4o-2024-08-06', 'system_fingerprint': 'fp_d28bcae782', 'finish_reason': 'tool_calls', 'logprobs': None}, id='run-e538251e-24e9-45ac-a5b7-b4ce111615ad-0', tool_calls=[{'name': 'retriever_tool', 'args': {'question': 'Who is the founder of FutureSmart AI?'}, 'id': 'call_fK9lMHGrtubenQ697xpd2ZZ2', 'type': 'tool_call'}], usage_metadata={'input_tokens': 70, 'output_tokens': 25, 'total_tokens': 95, 'input_token_details': {'audio': 0, 'cache_read': 0}, 'output_token_details': {'audio': 0, 'reasoning': 0}})]}})
----
(('rag:7c5458df-0abd-944a-27f7-b0bad49ccf3d',), {'tools': {'messages': [ToolMessage(content='FutureSmart AI provides customized speech to text services, employing cutting-\nedge speech recognition technologies to cater to specific client needs. Ideal for \ncreating efficient documentation and enabling voice-driven commands, this \nsolution boosts productivity and accessibility.\n\nFutureSmart AI provides custom Natural Language Processing (NLP) \nsolutions for companies looking to get ahead of the future. Our \ndedicated team of Data Scientists and ML Engineers provides an end-\nto-end solution from data labeling to modeling and deploying an ML \nmodel tailored to your specific use case. \nFounder: Pradip Nichite \n \nServices: \nText Classification \nAt FutureSmart AI, we develop custom text classification solutions using \nadvanced NLP techniques tailored to your specific business requirements. \nLeveraging Python, Pytorch, and Hugging Face transformers, we enable precise \ndata categorization across applications such as intent detection, document \ncategorization, and sentiment analysis, enhancing your decision-making \nprocesses and operational efficiency. \n \nChatbots \nWe specialize in creating custom chatbots that integrate seamlessly with your \nbusiness environment. Using semantic search and large language models, our', name='retriever_tool', id='fe12dcaa-a380-437f-8c24-5a7cbf6ab031', tool_call_id='call_fK9lMHGrtubenQ697xpd2ZZ2')]}})
----
(('rag:7c5458df-0abd-944a-27f7-b0bad49ccf3d',), {'agent': {'messages': [AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_nvmRMsfWcg0YVC9xeTqxZO7z', 'function': {'arguments': '{"question": "Who is Pradip Nichite?"}', 'name': 'retriever_tool'}, 'type': 'function'}, {'id': 'call_IGzCvWkpkzlpwlhFR1MR80U4', 'function': {'arguments': '{"question": "What is the professional background of Pradip Nichite?"}', 'name': 'retriever_tool'}, 'type': 'function'}], 'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 67, 'prompt_tokens': 322, 'total_tokens': 389, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}}, 'model_name': 'gpt-4o-2024-08-06', 'system_fingerprint': 'fp_d28bcae782', 'finish_reason': 'tool_calls', 'logprobs': None}, id='run-a4c6ec42-3aec-449c-ab39-ac029109eaad-0', tool_calls=[{'name': 'retriever_tool', 'args': {'question': 'Who is Pradip Nichite?'}, 'id': 'call_nvmRMsfWcg0YVC9xeTqxZO7z', 'type': 'tool_call'}, {'name': 'retriever_tool', 'args': {'question': 'What is the professional background of Pradip Nichite?'}, 'id': 'call_IGzCvWkpkzlpwlhFR1MR80U4', 'type': 'tool_call'}], usage_metadata={'input_tokens': 322, 'output_tokens': 67, 'total_tokens': 389, 'input_token_details': {'audio': 0, 'cache_read': 0}, 'output_token_details': {'audio': 0, 'reasoning': 0}})]}})
----
INSIDE RETRIEVER NODE
INSIDE RETRIEVER NODE
(('rag:7c5458df-0abd-944a-27f7-b0bad49ccf3d',), {'tools': {'messages': [ToolMessage(content='FutureSmart AI provides customized speech to text services, employing cutting-\nedge speech recognition technologies to cater to specific client needs. Ideal for \ncreating efficient documentation and enabling voice-driven commands, this \nsolution boosts productivity and accessibility.\n\nFutureSmart AI provides custom Natural Language Processing (NLP) \nsolutions for companies looking to get ahead of the future. Our \ndedicated team of Data Scientists and ML Engineers provides an end-\nto-end solution from data labeling to modeling and deploying an ML \nmodel tailored to your specific use case. \nFounder: Pradip Nichite \n \nServices: \nText Classification \nAt FutureSmart AI, we develop custom text classification solutions using \nadvanced NLP techniques tailored to your specific business requirements. \nLeveraging Python, Pytorch, and Hugging Face transformers, we enable precise \ndata categorization across applications such as intent detection, document \ncategorization, and sentiment analysis, enhancing your decision-making \nprocesses and operational efficiency. \n \nChatbots \nWe specialize in creating custom chatbots that integrate seamlessly with your \nbusiness environment. Using semantic search and large language models, our', name='retriever_tool', id='57d1c6f3-b789-4ae1-84c4-c156ca34d3c1', tool_call_id='call_nvmRMsfWcg0YVC9xeTqxZO7z'), ToolMessage(content='FutureSmart AI provides customized speech to text services, employing cutting-\nedge speech recognition technologies to cater to specific client needs. Ideal for \ncreating efficient documentation and enabling voice-driven commands, this \nsolution boosts productivity and accessibility.\n\nFutureSmart AI provides custom Natural Language Processing (NLP) \nsolutions for companies looking to get ahead of the future. 
Our \ndedicated team of Data Scientists and ML Engineers provides an end-\nto-end solution from data labeling to modeling and deploying an ML \nmodel tailored to your specific use case. \nFounder: Pradip Nichite \n \nServices: \nText Classification \nAt FutureSmart AI, we develop custom text classification solutions using \nadvanced NLP techniques tailored to your specific business requirements. \nLeveraging Python, Pytorch, and Hugging Face transformers, we enable precise \ndata categorization across applications such as intent detection, document \ncategorization, and sentiment analysis, enhancing your decision-making \nprocesses and operational efficiency. \n \nChatbots \nWe specialize in creating custom chatbots that integrate seamlessly with your \nbusiness environment. Using semantic search and large language models, our', name='retriever_tool', id='a498df53-f77a-4bfb-abbf-9153790295e5', tool_call_id='call_IGzCvWkpkzlpwlhFR1MR80U4')]}})
----
(('rag:7c5458df-0abd-944a-27f7-b0bad49ccf3d',), {'agent': {'messages': [AIMessage(content="The founder of FutureSmart AI is Pradip Nichite. Unfortunately, the current retrieval did not provide additional information specifically about Pradip Nichite's professional background or further personal details. For more comprehensive insights, you might consider conducting a more extensive web search or accessing professional networking sites like LinkedIn.", additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 63, 'prompt_tokens': 888, 'total_tokens': 951, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}}, 'model_name': 'gpt-4o-2024-08-06', 'system_fingerprint': 'fp_d28bcae782', 'finish_reason': 'stop', 'logprobs': None}, id='run-bc47c29c-0693-41c5-88be-b322e1fbb096-0', usage_metadata={'input_tokens': 888, 'output_tokens': 63, 'total_tokens': 951, 'input_token_details': {'audio': 0, 'cache_read': 0}, 'output_token_details': {'audio': 0, 'reasoning': 0}})]}})
----
((), {'rag': {'messages': [HumanMessage(content="The founder of FutureSmart AI is Pradip Nichite. Unfortunately, the current retrieval did not provide additional information specifically about Pradip Nichite's professional background or further personal details. For more comprehensive insights, you might consider conducting a more extensive web search or accessing professional networking sites like LinkedIn.", additional_kwargs={}, response_metadata={}, name='rag')]}})
----
Next Worker: web_researcher
((), {'supervisor': None})
----
(('web_researcher:509bb5e2-bf9e-2c1d-5c65-978a73d5e94c',), {'agent': {'messages': [AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_L5d4KhCSPsT5HmHTpHmnyryx', 'function': {'arguments': '{"query":"Pradip Nichite"}', 'name': 'tavily_search_results_json'}, 'type': 'function'}], 'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 23, 'prompt_tokens': 161, 'total_tokens': 184, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}}, 'model_name': 'gpt-4o-2024-08-06', 'system_fingerprint': 'fp_e161c81bbd', 'finish_reason': 'tool_calls', 'logprobs': None}, id='run-35518542-a3b6-424b-b4c7-f8fbb56cffd6-0', tool_calls=[{'name': 'tavily_search_results_json', 'args': {'query': 'Pradip Nichite'}, 'id': 'call_L5d4KhCSPsT5HmHTpHmnyryx', 'type': 'tool_call'}], usage_metadata={'input_tokens': 161, 'output_tokens': 23, 'total_tokens': 184, 'input_token_details': {'audio': 0, 'cache_read': 0}, 'output_token_details': {'audio': 0, 'reasoning': 0}})]}})
----
(('web_researcher:509bb5e2-bf9e-2c1d-5c65-978a73d5e94c',), {'tools': {'messages': [ToolMessage(content='[{"url": "https://www.youtube.com/c/PradipNichiteAI", "content": "Hello, my name is Pradip Nichite. I am a π Top Rated Plus Data Science Freelancer with 8+ years of experience, specializing in NLP and Back-End Development. Founder of FutureSmart AI, helping"}, {"url": "https://www.youtube.com/channel/UCwpCmuWq_NPVLNyr8z1IGGQ", "content": "I\'m Pradip Nichite, a Top Rated Plus freelance Data Scientist on Upwork πΌ, a successful digital nomad π, and an entrepreneur. My journey in freelancing has led me to earn over $200K π°"}]', name='tavily_search_results_json', id='5daeafe8-e673-425e-9d7e-35f49ccae710', tool_call_id='call_L5d4KhCSPsT5HmHTpHmnyryx', artifact={'query': 'Pradip Nichite', 'follow_up_questions': None, 'answer': None, 'images': [], 'results': [{'title': 'Pradip Nichite - YouTube', 'url': 'https://www.youtube.com/c/PradipNichiteAI', 'content': 'Hello, my name is Pradip Nichite. I am a π Top Rated Plus Data Science Freelancer with 8+ years of experience, specializing in NLP and Back-End Development. Founder of FutureSmart AI, helping', 'score': 0.8080827, 'raw_content': None}, {'title': 'Pradip Nichite - YouTube', 'url': 'https://www.youtube.com/channel/UCwpCmuWq_NPVLNyr8z1IGGQ', 'content': "I'm Pradip Nichite, a Top Rated Plus freelance Data Scientist on Upwork πΌ, a successful digital nomad π, and an entrepreneur. My journey in freelancing has led me to earn over $200K π°", 'score': 0.7636429, 'raw_content': None}], 'response_time': 1.73})]}})
----
(('web_researcher:509bb5e2-bf9e-2c1d-5c65-978a73d5e94c',), {'agent': {'messages': [AIMessage(content='Pradip Nichite is a Top Rated Plus Data Science Freelancer with over 8 years of experience, specializing in Natural Language Processing (NLP) and Back-End Development. He is the founder of FutureSmart AI. Additionally, Pradip is recognized as a successful digital nomad and entrepreneur, having earned over $200K through freelancing, primarily on platforms like Upwork. For more insights, you can explore his [YouTube channel](https://www.youtube.com/c/PradipNichiteAI), where he shares more about his experiences and expertise.', additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 116, 'prompt_tokens': 346, 'total_tokens': 462, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}}, 'model_name': 'gpt-4o-2024-08-06', 'system_fingerprint': 'fp_e161c81bbd', 'finish_reason': 'stop', 'logprobs': None}, id='run-32d923b8-a3be-420a-a06f-35a7e27c68bb-0', usage_metadata={'input_tokens': 346, 'output_tokens': 116, 'total_tokens': 462, 'input_token_details': {'audio': 0, 'cache_read': 0}, 'output_token_details': {'audio': 0, 'reasoning': 0}})]}})
----
((), {'web_researcher': {'messages': [HumanMessage(content='Pradip Nichite is a Top Rated Plus Data Science Freelancer with over 8 years of experience, specializing in Natural Language Processing (NLP) and Back-End Development. He is the founder of FutureSmart AI. Additionally, Pradip is recognized as a successful digital nomad and entrepreneur, having earned over $200K through freelancing, primarily on platforms like Upwork. For more insights, you can explore his [YouTube channel](https://www.youtube.com/c/PradipNichiteAI), where he shares more about his experiences and expertise.', additional_kwargs={}, response_metadata={}, name='web_researcher')]}})
----
Next Worker: FINISH
((), {'supervisor': None})
----
Don't be confused by the {'supervisor': None} entries; this has been observed by others as well, and the multi-agent system still works as intended.
Things to Improve
Add memory support to allow agents to remember previous interactions and maintain context
Improve the system prompt of the supervisor agent for better decision-making
Add more relevant tools to each individual agent
Conclusion
In this tutorial, we've built a sophisticated multi-agent system using LangGraph that demonstrates how to create specialized agents working together under centralized supervision. The system showcases advanced concepts in agent orchestration, tool integration, and state management - areas where we at FutureSmart AI have successfully delivered numerous enterprise solutions.
This architecture provides several advantages:
Modularity: Each agent has a specific role and can be modified independently
Scalability: New agents can be added without changing existing ones
Flexibility: The supervisor can dynamically choose the best agent for each task
Control: The workflow is clearly defined and manageable
If you found this guide useful and want to explore more, check out our YouTube video on Swarm, OpenAI's multi-agent framework.
More Learning Resources
If you prefer visual learning, check out these step-by-step tutorials:
Build RAG Applications from Scratch Without LangChain or LlamaIndex – Learn to build RAG applications without relying on frameworks for better customization and debugging.
LangChain RAG Course: From Basics to a Production-Ready RAG Chatbot – A comprehensive guide on implementing RAG using LangChain, covering everything from basics to production deployment.
Mastering Natural Language to SQL with LangChain and LangSmith | NL2SQL – A hands-on tutorial on converting natural language queries into SQL using LangChain and LangSmith.
At FutureSmart AI, we help businesses develop state-of-the-art AI solutions tailored to their needs. We've implemented similar multi-agent architectures for various industries, from customer service automation to complex AI Interview systems.
If you have inquiries, feel free to reach out to us at contact@futuresmart.ai. For real-world examples of our work, take a look at our case studies, where we showcase how our expertise in LangGraph and other AI technologies has delivered measurable business value.
Stay tuned for the next tutorial in this series, where we'll explore more advanced patterns and optimizations in multi-agent systems.
Written by

Rounak Show
Learning Data Science and Sharing the journey through Hashnode.