🤖💬 When the Bot Doesn’t Know… You Do! Human-in-the-Loop AI with LangGraph

“Even the smartest AI sometimes needs a helping hand — yours.”
Artificial Intelligence can write code, diagnose diseases, and even compose poetry — but what happens when it hits a wall?
Enter Human-in-the-Loop (HITL): a powerful design pattern that combines AI automation with human intervention at just the right moment. Today, I learned how to implement HITL using LangGraph, a framework for building stateful, graph-based workflows around AI agents.
Let’s explore how I built a smart chatbot that knows when to ask a human for help — using real code, real logic, and some seriously cool concepts. 👇
🧠 What is Human-in-the-Loop (HITL)?
Human-in-the-Loop (HITL) is a concept in AI system design where a human can interrupt, intervene, or validate steps taken by an AI model.
✨ Why HITL?
✅ For decisions that require judgment or ethics
🔍 When the model is unsure or lacks context
🧑‍⚖️ In high-risk areas like healthcare, law, or finance
With LangGraph, we can tell the AI:
“If you don’t know the answer… pause and ask a human.”
🛠️ How LangGraph Makes It Easy
LangGraph provides an interrupt() function that halts AI execution mid-flow and waits for human input before continuing.
```python
human_response = interrupt({"query": query})
return human_response["data"]
```
This creates a checkpoint — AI execution stops, waits for a human response, and resumes once data is provided. Beautifully simple!
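On the other side, a paused run is resumed by streaming a Command(resume=...) back into the same thread. Here's a minimal sketch of how the two halves pair up (some_node and resume_run are illustrative names, not part of the chatbot below):

```python
from langgraph.types import Command, interrupt

def some_node(state):
    # Pausing side: interrupt() checkpoints the run and surfaces the payload
    # to whoever is driving the graph. When the run is resumed, interrupt()
    # returns the value supplied via Command(resume=...).
    human_reply = interrupt({"query": "Need a human for this one."})
    return {"messages": [f"Human said: {human_reply['data']}"]}

def resume_run(graph, config, answer: str):
    # Resuming side: stream a Command(resume=...) into the same thread_id.
    for event in graph.stream(Command(resume={"data": answer}), config, stream_mode="values"):
        pass  # handle streamed state updates as needed
```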
🧩 Full Code: Human-in-the-Loop Chatbot with LangGraph
Below is the complete, functional implementation — a chatbot that can escalate queries to a human.
📦 Imports and Setup
```python
# flake8: noqa
from dotenv import load_dotenv
from typing import Annotated
from typing_extensions import TypedDict
from langgraph.graph.message import add_messages
from langchain.chat_models import init_chat_model
from langgraph.prebuilt import ToolNode, tools_condition
from langchain_core.tools import tool
from langgraph.graph import StateGraph, START, END
from langgraph.checkpoint.mongodb import MongoDBSaver
from langgraph.types import interrupt, Command
import json

load_dotenv()
```
🧰 Define the HITL Tool
```python
@tool
def human_assistance(query: str) -> str:
    """Request assistance from a human."""
    human_response = interrupt({"query": query})
    return human_response["data"]
```
✅ This is the heart of the system: if the AI calls this tool, it pauses, sends the query to a human, and waits for the answer.
🧠 Tool Registration and State Setup
```python
tools = [human_assistance]

class State(TypedDict):
    messages: Annotated[list, add_messages]
```
🔗 Model and Tool Binding
```python
llm = init_chat_model(model_provider="openai", model="gpt-4.1")
llm_with_tools = llm.bind_tools(tools=tools)
```
The model is now aware of the human-assistance tool and will call it when necessary.
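If you want to nudge the model toward escalating rather than guessing, one optional variation (a sketch, not part of the original code; the prompt wording is just an example) is to prepend a system message before invoking the bound model:

```python
from langchain_core.messages import SystemMessage

# Hypothetical escalation prompt; tune the wording for your use case.
ESCALATION_PROMPT = SystemMessage(content=(
    "If you are not confident in your answer, call the human_assistance tool "
    "instead of guessing."
))

def chatbot_with_prompt(state: State):
    # Same idea as the chatbot node in the next step, with the system message prepended.
    message = llm_with_tools.invoke([ESCALATION_PROMPT] + state["messages"])
    return {"messages": [message]}
```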
💬 AI Agent Logic
```python
def chatbot(state: State):
    message = llm_with_tools.invoke(state["messages"])
    return {"messages": [message]}
```
The chatbot receives the conversation state and returns the model's next message, which may include a call to human_assistance.
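When the model does escalate, that message carries a tool call. With an OpenAI model through LangChain it looks roughly like this (illustrative values; the exact shape can vary by provider and library version), which is what the admin interface parses later:

```python
# Rough shape of message.additional_kwargs["tool_calls"] (values are made up):
example_tool_call = {
    "id": "call_abc123",          # assigned by the provider
    "type": "function",
    "function": {
        "name": "human_assistance",
        "arguments": '{"query": "What is our refund policy?"}',  # JSON string
    },
}
```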
🔄 Build the LangGraph
```python
tool_node = ToolNode(tools=tools)

graph_builder = StateGraph(State)
graph_builder.add_node("chatbot", chatbot)
graph_builder.add_node("tools", tool_node)

graph_builder.add_edge(START, "chatbot")
graph_builder.add_conditional_edges("chatbot", tools_condition)
graph_builder.add_edge("tools", "chatbot")
graph_builder.add_edge("chatbot", END)
```
🧩 This defines the flow:
🟢 Start → 🤖 Chatbot
🤖 Chatbot → (tool call?) → 🛠 Tools → back to 🤖 Chatbot; otherwise → 🔴 End
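If you want to sanity-check the wiring, LangGraph can render the compiled graph for you; a quick sketch (compiling here without a checkpointer, purely for inspection):

```python
# Print Mermaid source describing the flow above; paste it into any Mermaid viewer.
compiled = graph_builder.compile()
print(compiled.get_graph().draw_mermaid())
```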
🧱 Compiling the Graph
```python
def create_chat_graph(checkpointer):
    return graph_builder.compile(checkpointer=checkpointer)
```
We use a MongoDB checkpointer to persist graph state, so an interrupted run can be resumed later, even from a different process.
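For quick local experiments without MongoDB, the same factory works with LangGraph's in-memory checkpointer (a sketch; nothing survives a process restart, so the two-terminal handoff below really does need a persistent store):

```python
from langgraph.checkpoint.memory import MemorySaver

# In-memory alternative for local testing; state lives only in this process.
graph_in_memory = create_chat_graph(MemorySaver())
```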
👤 User Chat Interface
```python
def user_chat():
    DB_URI = "mongodb://admin:admin@mongodb:27017"
    config = {"configurable": {"thread_id": "21"}}

    with MongoDBSaver.from_conn_string(DB_URI) as mongo_checkpointer:
        graph_with_cp = create_chat_graph(mongo_checkpointer)

        while True:
            user_input = input("> ")
            state = State(messages=[{"role": "user", "content": user_input}])

            for event in graph_with_cp.stream(state, config, stream_mode="values"):
                if "messages" in event:
                    event["messages"][-1].pretty_print()
```
The user types messages in a loop. If the AI decides it needs help, the run pauses at the interrupt and waits for a human.
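How do you know the run actually paused? One way to check (a sketch, assuming a recent LangGraph version and the same thread_id/checkpointer) is to inspect the thread's state after streaming:

```python
def is_waiting_for_human(graph_with_cp, config) -> bool:
    # A paused run has pending tasks that carry Interrupt objects.
    snapshot = graph_with_cp.get_state(config)
    return any(task.interrupts for task in snapshot.tasks)
```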
🧑🏫 Admin Interface: Answer AI's Help Call
```python
def admin_call():
    DB_URI = "mongodb://admin:admin@mongodb:27017"
    config = {"configurable": {"thread_id": "21"}}

    with MongoDBSaver.from_conn_string(DB_URI) as mongo_checkpointer:
        graph_with_cp = create_chat_graph(mongo_checkpointer)

        state = graph_with_cp.get_state(config=config)
        last_message = state.values["messages"][-1]
        tool_calls = last_message.additional_kwargs.get("tool_calls", [])

        user_query = None
        for call in tool_calls:
            if call.get("function", {}).get("name") == "human_assistance":
                args = call["function"].get("arguments", "{}")
                try:
                    args_dict = json.loads(args)
                    user_query = args_dict.get("query")
                except json.JSONDecodeError:
                    print("Failed to decode function arguments.")

        print("User Has a Query:", user_query)
        solution = input("> ")

        resume_command = Command(resume={"data": solution})
        for event in graph_with_cp.stream(resume_command, config, stream_mode="values"):
            if "messages" in event:
                event["messages"][-1].pretty_print()
```
This function allows a human (admin) to step in, see what the AI needs help with, and resume execution after replying.
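A slightly more direct variant (a sketch, assuming a recent LangGraph version): instead of parsing the raw tool-call JSON, read the payload that interrupt() attached to the thread's pending task:

```python
def pending_query(graph_with_cp, config):
    # The dict passed to interrupt({"query": ...}) is available on the paused
    # task's Interrupt object, so no JSON parsing is required.
    snapshot = graph_with_cp.get_state(config)
    for task in snapshot.tasks:
        for intr in task.interrupts:
            return intr.value.get("query")
    return None
```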
🧪 Try It Out!
Start the chatbot:
```python
user_chat()
```
Then, in another terminal, run the human handler:
```python
admin_call()
```
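If both functions live in one script, a tiny entry point (illustrative, not part of the original code) makes it easy to launch either role:

```python
import sys

if __name__ == "__main__":
    # e.g. `python app.py admin` answers pending queries; anything else starts the chat.
    if len(sys.argv) > 1 and sys.argv[1] == "admin":
        admin_call()
    else:
        user_chat()
```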
It’s like having a smart AI agent that knows when to call you in — like a junior assistant asking for mentorship.
🌟 Why HITL Matters
💡 Accuracy: Avoid hallucinations by asking a human when unsure
🛡️ Safety: Add oversight to sensitive AI decisions
⚖️ Ethics: Leave critical judgment to humans
📈 Scalability: Let AI handle most tasks, but escalate when needed
🧠 Final Thoughts
LangGraph makes it effortless to embed human supervision in AI workflows. With just one tool and one graph, you get a system that’s:
✅ Smart
✅ Trustworthy
✅ Flexible
✅ Production-ready
Sometimes, the smartest move your AI can make… is to ask you. 🧑🏫
🛠️ Built With:
🔗 LangGraph
🧠 OpenAI GPT-4.1
📦 LangChain
🧮 MongoDB (for checkpointing)
🚀 Happy coding — and don't forget to help your bots help you!