MCP Protocol and "MCP-USE" for you!


A dive into the Model Context Protocol (MCP), its architecture, and how mcp-use
brings seamless tool orchestration to language models.
Modern workflows are becoming increasingly agentic: models don't just generate text, they also use tools. But how do you make a language model interact with external tools like a browser, a filesystem, or an API?
That’s where the Model Context Protocol (MCP) comes in.
Here, we'll explore:
What the MCP protocol is
How the client-server model works
What the mcp-use project is and how it orchestrates everything
Visual architecture diagrams
A working Python example using multiple tool servers
Whether you're building AI agents, LangChain tools, or custom LLM workflows, this protocol and project are for you.
What is MCP?
MCP (Model Context Protocol) defines a standardized way for a language model client to communicate with external tool servers.
At its core, MCP is a client-server protocol, where:
The MCP Client connects with the LLM and routes tool requests
The MCP Server hosts a set of capabilities (sketched in code after this list), such as:
Tools (functions with arguments)
Resources (files, data)
Prompts (template injections)
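To make this concrete, here is a minimal sketch of a server that exposes a single tool, using the official MCP Python SDK's FastMCP helper; the tool itself is a made-up example, not something from this article:

# server.py: a minimal MCP server exposing one tool over stdio
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-server")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers and return the result."""
    return a + b

if __name__ == "__main__":
    # Serve over stdio so an MCP client can launch this process and talk to it
    mcp.run()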
MCP Client–Server Architecture
At its base, one MCP client (sitting alongside the LLM) opens a session with each MCP server over a transport such as stdio, HTTP, or WebSocket, performs a handshake, discovers the server's capabilities, and then invokes tools on the model's behalf.
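In code, that handshake looks roughly like the sketch below, which uses the official MCP Python SDK's client classes to launch the server from the previous sketch over stdio, list its tools, and call one (the file name and tool name are carried over from that sketch):

# client.py: connect to an MCP server over stdio, discover its tools, call one
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the server as a subprocess and talk to it over stdin/stdout
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()                 # protocol handshake
            tools = await session.list_tools()         # discover capabilities
            print([tool.name for tool in tools.tools])
            result = await session.call_tool("add", {"a": 2, "b": 3})
            print(result.content)

asyncio.run(main())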
What is mcp-use?
mcp-use is a Python SDK and agent framework built around the MCP protocol. It's designed to plug any number of MCP servers into an LLM-based agent with minimal setup.
It supports:
LangChain integration
Multiple concurrent tool servers
WebSocket, HTTP, or npx-based transport
Automatic tool selection by the LLM
Here's a full picture of how mcp-use orchestrates the moving parts, in a working Python example:
from mcp_use import MCPAgent, MCPClient
from langchain_anthropic import ChatAnthropic
import asyncio
from dotenv import load_dotenv

async def main():
    # Load API keys (e.g. ANTHROPIC_API_KEY) from a .env file
    load_dotenv()

    # Each entry under "mcpServers" tells the client how to launch one tool server
    config = {
        "mcpServers": {
            "airbnb": {
                "command": "npx",
                "args": ["@openbnb/mcp-server-airbnb"],
            },
            "playwright": {
                "command": "npx",
                "args": ["@playwright/mcp"],
                "env": {"DISPLAY": ":1"},
            },
            "filesystem": {
                "command": "npx",
                "args": ["@modelcontextprotocol/server-filesystem", "."],
            },
        }
    }

    client = MCPClient.from_dict(config)
    llm = ChatAnthropic(model="claude-3-5-sonnet-20240620")
    agent = MCPAgent(llm=llm, client=client, max_steps=20)

    result = await agent.run(
        "Find a place in Barcelona on Airbnb and save it to notes.txt"
    )
    print(result)

asyncio.run(main())
The agent decides which tools to call, each call is routed to the appropriate server, and the final result is written to notes.txt via the filesystem server.
The key benefits of this design are:
Modular — tools are decoupled from the LLM logic
Standardized — same protocol regardless of transport
Flexible — use HTTP, WebSocket, stdin/stdout
Composable — use multiple servers in parallel, as in the config sketch below
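For example, mixing transports in a single agent is just a matter of configuration. The sketch below assumes mcp-use can connect to an already-running HTTP/SSE server via a url entry; the server name and endpoint are hypothetical:

# Mixing transports in one MCPClient config (sketch; the "url"-style entry and
# the endpoint below are assumptions, not taken from this article)
from mcp_use import MCPClient

config = {
    "mcpServers": {
        # stdio transport: the client launches the server as a subprocess
        "filesystem": {
            "command": "npx",
            "args": ["@modelcontextprotocol/server-filesystem", "."],
        },
        # HTTP/SSE transport: the client connects to a server that is already running
        "remote_tools": {
            "url": "http://localhost:8000/sse",
        },
    }
}

client = MCPClient.from_dict(config)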
MCP + mcp-use offer a future-proof way to empower LLM agents with tools, safely and in a scalable manner. Whether you're automating research, file operations, web browsing, or launching workflows, this open protocol gives you the backbone to make your agent actually useful.
Beyond the SDK, mcp-use.com also provides:
1-Click Deployments: Instantly spin up MCP servers (e.g., Playwright, FS, custom tools) using Vercel-like 1-click flows.
Unified MCP Gateway: Aggregate multiple MCP servers under one endpoint with built-in routing, auth, ACLs, metrics, and observability.
Persistent & Ephemeral Server Support: Deploy short-lived servers via npx or keep long-running ones alive.
Streaming Responses Out of the Box: Supports SSE, WebSockets, or stdio, and enables live streaming of tool outputs back to the LLM.
Enterprise-Ready Security Stack: OAuth, Access Control Lists (ACLs), rate limiting, and built-in token-based auth for secure gateway usage.
Built-in Monitoring & Tracing, LLM-Aware Agents, Multi-language SDK Support, Zero-Boilerplate Agent Setup, and more.
Give it a try (you don't need any fancy tools or paid API keys; Ollama on your local machine will work just fine)!
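For a fully local run, the only change to the earlier example is the LLM. A sketch assuming the langchain-ollama package and a locally pulled model that supports tool calling (the model name here is just an example):

# Swap the hosted LLM for a local Ollama model (requires the langchain-ollama
# package and a model with tool-calling support)
from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3.1")
# Everything else stays the same:
# agent = MCPAgent(llm=llm, client=client, max_steps=20)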
Happy Weekend!