Unlocking your Developer Abilities with Model Context Protocol (MCP)

UV Panta

Image Credit: Aravind Putrevu

TL;DR MCP is the USB‑C for AI agents: a simple, open standard that lets large‑language‑model (LLM) assistants reach outside the chat box and safely operate real tools — your file‑system, GitHub issues, databases, SaaS APIs, you name it. With a couple of lines in a config file you can go from “write me a script” to “run ESLint on my repo and open a pull‑request.”

🚀 A Quick Glimpse of the Magic

Imagine this: you highlight a buggy function in VS Code and type

“Claude (or Cursor), please refactor this with proper error‑handling, test it, and commit the patch.”

Within seconds:

  1. The LLM calls the Filesystem Server through MCP to open the file, edits the code, and writes it back.

  2. It invokes the Git Server to push the branch.

  3. All the while it keeps you in the loop, asking for confirmation at each step (though you can also make it fully autonomous).

That end‑to‑end flow takes zero manual shell commands — because MCP wires the AI straight into your dev toolbox.

📖 MCP in One Paragraph

Model Context Protocol (MCP) is an open, language‑agnostic spec created by Anthropic for connecting LLMs to external capabilities. MCP defines how three actors talk:

Host — The app you live in (e.g. Claude Desktop, Cursor AI).
Client — The connector inside the host that speaks MCP.
Server — A tiny program that exposes abilities (file I/O, GitHub issues, databases…)

There are two modes of communication with MCP servers; a short client‑side sketch follows the list:

  • stdio → local processes similar to CLI commands (simplicity & full desktop permissions)

  • SSE → remote/cloud servers (scalable & language‑agnostic)
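
To make the two transports concrete, here is a minimal client‑side sketch using the official TypeScript SDK (@modelcontextprotocol/sdk). It spawns the reference filesystem server locally and points at the Composio GitHub endpoint used later in this article; treat the package name, the path, and the URL as illustrative placeholders rather than endpoints this post verifies.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// stdio: spawn a local server process and talk to it over stdin/stdout.
const localTransport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-filesystem", "/tmp/demo"],
});

// SSE: connect to a remote server by URL (illustrative endpoint).
const remoteTransport = new SSEClientTransport(
  new URL("https://mcp.composio.dev/github")
);

const client = new Client({ name: "demo-client", version: "1.0.0" });
await client.connect(localTransport); // or: await client.connect(remoteTransport);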

🛠️ Architecture at a Glance

╭── Host (Claude Desktop / Cursor) ──────────────╮
│  User asks → “Search my repo for TODOs”        │
│  ┌────────────┐                                │
│  │ MCP Client │                                │
│  └─────┬──────┘                                │
╰────────┼───────────────────────────────────────╯
         │ stdio
         ▼
   Filesystem Server ──▶ returns list of matches

🔌 Adding Ready‑made Servers

You don’t have to write a single line of TypeScript or Python to get value out of MCP. The community already ships dozens of “ability packs” (servers) that expose everyday dev tasks through a simple RPC interface. All you do is tell Cursor where the server binary lives (for local use) or what URL it’s streaming on (for remote).

Where does the config live? (e.g., in Cursor)

Cursor looks for an MCP configuration in two places:

+------------------+----------------------+---------------------------------------------------------------+
| Scope            | Path                 | When to use                                                   |
+------------------+----------------------+---------------------------------------------------------------+
| Project-specific | ./.cursor/mcp.json   | Keep experiments or secrets isolated to one codebase.         |
| Global           | ~/.cursor/mcp.json   | Re‑use the same servers across every project on your machine. |
+------------------+----------------------+---------------------------------------------------------------+

The JSON schema is identical in both files.

Claude Desktop, VS Code, and other agent-enabled editors store their MCP configuration in much the same way.

Example: Filesystem + GitHub servers

// ~/.cursor/mcp.json
{
  "mcpServers": {
    // Local server over stdio — full desktop permissions
    "filesystem": {
      "command": "npx",
      "args": ["@anthropic-ai/mcp-filesystem", "--paths", "~/projects/my-app"]
    },
    // Cloud server over SSE — scoped token auth
    "github": {
      "url": "https://mcp.composio.dev/github",
      "env": { "GITHUB_TOKEN": "${env:GH_TOKEN}" }
    }
  }
}

Save the file and Cursor hot‑reloads the servers. Press ⌘⇧P → “List MCP Abilities” to confirm they’re active; occasionally you may need to restart the application.

+-------------------------+---------------------------------------------------------------+
| Ability                 | How to install / add                                          |
+-------------------------+---------------------------------------------------------------+
| Filesystem (local)      | npx @anthropic-ai/mcp-filesystem                              |
| GitHub                  | npx mcp-github OR URL https://mcp.composio.dev/github         |
| Postgres / Neon         | npx mcp-neon                                                  |
| Email (Resend)          | npx mcp-resend                                                |
| Redis KV (Upstash)      | npx mcp-upstash                                               |
+-------------------------+---------------------------------------------------------------+

Tip: The Cursor docs keep a living catalogue of community servers where you can copy‑paste ready configs — including niche tools like Stripe, Notion, or Docker Compose.

How does Cursor pick a server?

When you write something like:

“Search the project for deprecated React lifecycle methods and open an issue for each file.”

Cursor:

  1. Parses the request and sees it needs Filesystem, then GitHub abilities.

  2. Finds servers in mcp.json that implement those abilities.

  3. Streams tool calls, prompting for confirmation after each major step.

Because the routing is declarative, you can swap a local Filesystem server for a remote one (perhaps running in CI) without changing a single prompt.
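
Under the hood, each of those steps is a pair of MCP calls: discover the tools a server advertises, then invoke one. The sketch below shows roughly what the host does on your behalf, again with the TypeScript SDK; the search_files tool and its arguments are assumed from the reference filesystem server, so always check what your server actually advertises.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const client = new Client({ name: "router-demo", version: "1.0.0" });
await client.connect(
  new StdioClientTransport({
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-filesystem", "/tmp/demo"],
  })
);

// 1. Discovery: the server lists its tools, each with a name and a JSON schema.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// 2. Invocation: the host fills in the arguments and streams the call.
//    "search_files" and its arguments are assumed from the reference server.
const result = await client.callTool({
  name: "search_files",
  arguments: { path: "/tmp/demo", pattern: "TODO" },
});
console.log(result);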

🧑‍💻 Build Your Own MCP Server

MCP is open‑ended — you can wire an LLM into any API, database, or internal tool by packaging a lightweight server that speaks the protocol.
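
As a concrete starting point, here is a minimal stdio server built with the official TypeScript SDK; the ping tool is a made‑up example, and the whole thing is a sketch rather than a production template.

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Declare the server and the abilities it exposes.
const server = new McpServer({ name: "my-first-server", version: "0.1.0" });

// A hypothetical "ping" tool: the LLM passes a message, the server echoes it back.
server.tool(
  "ping",
  { message: z.string().describe("Text to echo back") },
  async ({ message }) => ({
    content: [{ type: "text", text: `pong: ${message}` }],
  })
);

// stdio transport: the host (Cursor, Claude Desktop) launches this process
// and talks to it over stdin/stdout.
await server.connect(new StdioServerTransport());

Point Cursor at the compiled script with a command entry like the filesystem example above, and the ping tool appears next to the ready‑made ones.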

Deployment models at a glance

+-------------------------------+-----------+---------------------------------------------------+
| Model                         | Transport | Ideal for                                        |
+-------------------------------+-----------+---------------------------------------------------+
| Local dev                     | stdio     | Rapid prototyping with full desktop permissions   |
| Docker container              | sse       | Reproducible CI/CD tasks and team‑wide sharing    |
| Serverless (Cloud Run, etc.)  | sse       | Spiky workloads and pay‑per‑use operations        |
| Dedicated VM / Kubernetes     | sse       | Always‑on production agents with custom scaling   |
+-------------------------------+-----------+---------------------------------------------------+

Pro tip: Start with a local stdio server, add auth & rate‑limits, then ship a container to your cloud provider once you’re ready to share it with the team.
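
When you outgrow stdio, the same server object can be exposed over SSE behind a small HTTP wrapper. Below is a single‑client sketch using Express and the TypeScript SDK's SSEServerTransport; there is no auth or rate‑limiting yet, so treat it as a starting point rather than something to ship as‑is.

import express from "express";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { SSEServerTransport } from "@modelcontextprotocol/sdk/server/sse.js";

const server = new McpServer({ name: "my-first-server", version: "0.1.0" });
// ...register tools exactly as in the stdio sketch above...

const app = express();
let transport: SSEServerTransport | undefined;

// The client opens a long-lived SSE stream here...
app.get("/sse", async (_req, res) => {
  transport = new SSEServerTransport("/messages", res);
  await server.connect(transport);
});

// ...and posts its JSON-RPC messages back on this endpoint.
app.post("/messages", async (req, res) => {
  if (!transport) {
    res.status(400).send("No active SSE session");
    return;
  }
  await transport.handlePostMessage(req, res);
});

app.listen(3000, () => console.log("MCP server listening on :3000"));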

🏢 Hosting & Discovery Platforms

Use at your own risk: These services are community‑run or early‑stage. Always audit the source, scope tokens narrowly, and prefer self‑hosting for sensitive workloads.

  • Docker Hub — A curated collection of 100+ secure, high‑quality MCP servers packaged as Docker images.

  • Composio Hub — Dozens of ready‑to‑stream servers (GitHub, Jira, etc.); free & paid tiers with per‑server rate limits.

  • Smithery Cloud — Curated gallery of pre‑hosted MCP servers for popular SaaS APIs; one‑click copy‑to‑Cursor config.

  • Cursor Directory — Official SSE URLs for GitHub, Filesystem, Postgres, and more; zero install, authenticate via OAuth or PAT.

  • Resend MCP — Send emails directly from Cursor via Resend’s email‑sending server.

  • Serverless runtimes (Lambda, Cloud Run, Vercel, Netlify, Fly.io) — Bring‑your‑own code; deploy over SSE for low idle cost.

🌟 Real‑World Use Cases

  • Issue Triage Bot — A GitHub MCP server lets your LLM label, cluster, and auto‑respond to new issues each morning (see the sketch after this list).

  • Pull‑Request Reviewer — The same GitHub server streams diffs so the assistant can run linters, leave inline comments, and approve or request changes.

  • GKE Cluster Doctor — A custom GKE server exposes live kubectl data; the assistant surfaces crashing pods and proposes quick‑fix commands like kubectl rollout restart.

  • AWS Cost Guardian — An AWS server taps Cost Explorer and CloudWatch; each day the bot posts a Slack summary of unusual spend spikes.

  • Release‑Notes Generator — Combining GitHub and Filesystem servers, the agent compiles merged PR titles into human‑friendly release notes and opens a changelog PR.

  • Meeting Minutes Synthesizer — A Google Meet Notes server streams transcripts so the assistant publishes action items straight to Confluence or Notion.

  • Terraform Plan Explainer — Filesystem + Terraform servers let the bot translate a terraform plan into plain‑English risk and impact statements.

  • CI/CD Fixer — A Buildkite or GitHub Actions server grants log access; the agent pinpoints failing steps, suggests fixes, and can rerun the job on approval.
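
To make the first of these concrete, here is a hedged sketch of a scheduled triage script. It assumes the reference GitHub MCP server (@modelcontextprotocol/server-github) and a list_issues tool; the package, tool, argument, and env‑var names are assumptions, so confirm them with client.listTools() against whichever GitHub server you actually run.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const client = new Client({ name: "issue-triage-bot", version: "0.1.0" });

// Assumed: the reference GitHub server, authenticated with a narrowly scoped token.
await client.connect(
  new StdioClientTransport({
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-github"],
    // Pass the parent environment through, plus the token the server expects.
    env: {
      ...(process.env as Record<string, string>),
      GITHUB_PERSONAL_ACCESS_TOKEN: process.env.GH_TOKEN ?? "",
    },
  })
);

// Hypothetical tool and argument names; verify via client.listTools() first.
const issues = await client.callTool({
  name: "list_issues",
  arguments: { owner: "my-org", repo: "my-app", state: "open" },
});

// Hand the raw issue list to your LLM for labelling and clustering, then call
// tools such as "add_labels" or "add_issue_comment" with its decisions.
console.log(issues);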

Closing Thought

USB‑C didn’t just simplify charging — it unleashed an ecosystem of plug‑and‑play gadgets. MCP is doing the same for AI: every server you add snaps a fresh superpower onto your assistant.

Today, that might mean auto‑drafting PR reviews; tomorrow, spinning up disposable GKE clusters or summarising your stand‑ups before the first coffee.

What will you plug in next? Share your experiments, lessons learnt, and wish‑lists in the comments, and let’s blueprint the next generation of tool‑augmented development — together.
