A2A: Innovative Protocol or Redundant Layer? Why API Gateways and MCP May Already Have Us Covered


This article is companion content to the MCP + UTAM blog post.
Introduction
Google's Agent-to-Agent (A2A) protocol, introduced in April 2025, promises to be the "missing glue" for large, multi-framework agent ecosystems. Positioned as a vendor-agnostic "lingua franca" for AI agents, A2A borrows concepts from service meshes and aims to complement Anthropic's Model Context Protocol (MCP). But does the AI ecosystem really need another protocol layer? This analysis examines whether A2A brings genuine innovation or simply repackages existing solutions, with a critical focus on how API gateways (like Apigee) and MCP already address most use cases.
What Exactly Is A2A?
A2A is an open, HTTP-based protocol designed to enable autonomous agents to:
Discover each other through signed Agent Cards
Exchange structured task messages
Coordinate complex, potentially asynchronous workflows
Google positions it as complementary to Anthropic's MCP: while MCP standardizes tool invocation, A2A standardizes agent-to-agent conversations.
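To make that distinction concrete, here is a rough, illustrative contrast between the two message shapes. The field names and values below are simplified placeholders modeled on the examples later in this article, not the exact wire formats of either specification.
# Illustrative only: simplified payload shapes, not exact wire formats.

# MCP-style tool invocation: a model asks a server to run one named tool.
mcp_tool_call = {
    "type": "tool_call",
    "name": "analyze_data",
    "parameters": {"dataset": "s3://bucket/q3.csv", "metrics": ["mean", "stddev"]},
}

# A2A-style task message: one agent asks another agent to perform an action.
a2a_task = {
    "id": "7f3c9a2e-...",           # task identifier chosen by the sender
    "sender": "urn:agent:planner",  # identity of the requesting agent
    "action": "information_request",
    "params": {"topic": "q3 revenue"},
}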
The Case For A2A: What Proponents Claim
Supporters highlight several potential benefits:
1. Standardized Interoperability
# A2A-style structured communication
import uuid
import requests

def request_information(sender_agent, receiver_agent, request_schema):
    """Send a standardized A2A request with clear expectations."""
    request = {
        "id": str(uuid.uuid4()),
        "sender": sender_agent.id,
        "action": "information_request",
        "params": request_schema,
    }
    return requests.post(f"{receiver_agent.endpoint}/a2a", json=request).json()
A2A defines a shared JSON schema and transport protocol so agents written in different frameworks can communicate without custom integration code.
2. Agent Discovery and Capability Advertising
Agents expose capabilities, auth methods, and version information through standardized endpoints:
# Assumes a FastAPI `app` plus the standard `uuid` and `os` imports (a full version appears later).
@app.get("/.well-known/agent-card.json")
def agent_card():
    return {
        "id": "urn:uuid:f3f108d3-" + str(uuid.uuid4())[:8],
        "name": "DataAnalysisAgent",
        "description": "Analyzes financial datasets",
        "endpoints": {"http": os.getenv("BASE_URL", "http://localhost:8000") + "/a2a"},
        "actions": [
            {"name": "analyze_time_series", "params_schema": {"dataset_url": "string", "frequency": "string"}}
        ],
        "security": {"auth": "Bearer", "scopes": ["data.read"]},
        "version": "1.0.0",
    }
3. Built-in Patterns for Asynchronous Tasks
A2A includes patterns for long-running or streamed tasks, where clients can poll status endpoints (such as /a2a/status/{task_id} below) or subscribe to webhooks:
# Assumes the FastAPI `app` above and a `task_store` dict populated when tasks are accepted.
@app.get("/a2a/status/{task_id}")
def task_status(task_id: str):
    task = task_store.get(task_id)
    if not task:
        raise HTTPException(404, "Task not found")
    return {
        "id": task_id,
        "status": task["status"],
        "result": task.get("result"),
        "completed_at": task.get("completed_at"),
    }
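On the client side, polling this endpoint is straightforward. The sketch below is minimal and makes some assumptions: the base URL and bearer token are placeholders, and "completed"/"failed" are assumed terminal statuses.
import time
import requests

def wait_for_task(task_id, base_url="http://localhost:8000", token="demo", timeout=60):
    """Poll the status endpoint until the task settles or the timeout expires."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        resp = requests.get(
            f"{base_url}/a2a/status/{task_id}",
            headers={"Authorization": f"Bearer {token}"},
        )
        body = resp.json()
        if body["status"] in ("completed", "failed"):
            return body
        time.sleep(2)  # back off between polls
    raise TimeoutError(f"Task {task_id} did not finish within {timeout}s")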
4. Governance and Observability
A2A includes standard headers for trace-IDs and policy tags that slot into enterprise observability stacks like OpenTelemetry and Cloud Logging.
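In practice this might look like the sketch below, which wraps an outgoing A2A request in a span and propagates W3C trace-context headers using the OpenTelemetry Python API. The X-Policy-Tags header name is a placeholder of my own, not something the A2A spec guarantees.
import requests
from opentelemetry import trace
from opentelemetry.propagate import inject

tracer = trace.get_tracer(__name__)

def send_a2a_task(endpoint, payload):
    """Send an A2A task inside a traced span, propagating trace-context headers."""
    with tracer.start_as_current_span("a2a.task.send"):
        headers = {"X-Policy-Tags": "pii:none,env:prod"}  # placeholder policy tags
        inject(headers)  # adds `traceparent` (and `tracestate`) for downstream correlation
        return requests.post(endpoint, json=payload, headers=headers).json()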
The Case Against A2A: A Critical Examination
Despite these claimed benefits, there are compelling arguments that A2A largely replicates functionality already available through existing technologies:
1. "Reinventing REST": OpenAPI Already Solves This Problem
# Standard OpenAPI approach
import uuid
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="DataAnalysisService", version="1.0.0")

class AnalysisRequest(BaseModel):
    dataset_url: str
    frequency: str

class AnalysisResponse(BaseModel):
    analysis_id: str
    summary: dict
    status: str

@app.post("/analysis", response_model=AnalysisResponse, tags=["analysis"])
def create_analysis(req: AnalysisRequest):
    # Process the analysis request (process_analysis is domain logic, not shown here)
    result = process_analysis(req.dataset_url, req.frequency)
    return AnalysisResponse(
        analysis_id=str(uuid.uuid4()),
        summary=result,
        status="completed",
    )
OpenAPI/Swagger already provides a standardized way to document, discover, and interact with APIs. Adding another layer feels redundant.
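The discovery story largely exists already, too: FastAPI, like most OpenAPI tooling, serves a machine-readable spec that clients can fetch and inspect, playing much the same role as an Agent Card. A minimal sketch, assuming the DataAnalysisService above is running locally:
import requests

# FastAPI serves the generated spec at /openapi.json by default.
spec = requests.get("http://localhost:8000/openapi.json").json()

# The spec advertises the service's "capabilities": its paths and schemas.
print(spec["info"]["title"], spec["info"]["version"])
for path, methods in spec["paths"].items():
    for method, op in methods.items():
        print(method.upper(), path, "-", op.get("summary", op.get("operationId")))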
2. API Gateways Already Handle Discovery, Security, and Routing
Consider this Kong API Gateway configuration:
services:
  - name: analysis-service
    url: http://analysis-service:8000
routes:
  - name: analysis-route
    service: analysis-service
    paths: ["/analysis"]
plugins:
  - name: jwt
  - name: rate-limiting
    config:
      minute: 60
  - name: cors
API gateways like Kong, Apigee, Ambassador, or Envoy already provide:
Service discovery
Authentication and authorization
Rate limiting and traffic management
Observability and logging
Protocol translation
The question becomes: What does A2A add that a well-configured API gateway doesn't already provide?
3. MCP Already Standardizes Tool Invocation
Anthropic's Model Context Protocol (MCP) already provides a standardized way for models to invoke tools. For many agent interactions, this existing protocol is sufficient:
def handle_mcp_request(mcp_request):
    # MCP already has a standardized format for tool calls
    if mcp_request["type"] == "tool_call":
        tool_name = mcp_request["name"]
        tool_params = mcp_request["parameters"]
        if tool_name == "analyze_data":
            # Dispatch to the appropriate service (data_analysis_service is domain code, not shown)
            result = data_analysis_service.analyze(tool_params["dataset"], tool_params["metrics"])
            return {
                "type": "tool_response",
                "content": result,
            }
4. Security Concerns
A2A introduces new security considerations:
Agent Cards can potentially be spoofed (see the verification sketch after this list)
The protocol leaves authentication token exchange "out-of-band"
Adding another layer increases the attack surface
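One mitigation for the first point is to treat Agent Cards as untrusted until verified. The sketch below assumes a card that carries a JWS signature over its claims and an issuer public key distributed out-of-band; none of this is mandated by A2A itself, so the field names and signing scheme here are assumptions.
import jwt  # PyJWT
import requests

ISSUER_PUBLIC_KEY = open("issuer_public.pem").read()  # assumed out-of-band trust anchor

def fetch_verified_card(agent_base_url):
    """Fetch an Agent Card and accept it only if its signature verifies against a trusted issuer."""
    card = requests.get(f"{agent_base_url}/.well-known/agent-card.json").json()
    signed_claims = jwt.decode(
        card["signature"],   # assumed: a JWS over the card's claims
        ISSUER_PUBLIC_KEY,
        algorithms=["RS256"],
    )
    if signed_claims.get("name") != card.get("name"):
        raise ValueError("Agent Card claims do not match signed payload")
    return card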
5. Operational Complexity
Running an A2A router, managing Agent Cards, and handling versioning adds operational overhead for DevOps and SRE teams who already manage API gateways and service meshes.
Code Comparison: A2A vs. Simple OpenAPI + API Gateway
A2A Implementation
# inventory_agent.py
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
import uuid, os, datetime as dt

app = FastAPI(title="InventoryAgent", version="0.1.0")

class Task(BaseModel):
    id: str
    action: str
    params: dict

class Result(BaseModel):
    id: str
    status: str
    result: dict
    completed_at: str

@app.get("/.well-known/agent-card.json")
def agent_card():
    return {
        "id": "urn:uuid:f3f108d3-" + str(uuid.uuid4())[:8],
        "name": "InventoryAgent",
        "description": "Returns on-hand inventory counts",
        "endpoints": {"http": os.getenv("BASE_URL", "http://localhost:8000") + "/a2a"},
        "actions": [{"name": "get_stock", "params_schema": {"sku": "string"}}],
        "security": {"auth": "Bearer", "scopes": ["inventory.read"]},
        "version": "0.1.0",
    }

@app.post("/a2a", response_model=Result)
def handle(task: Task):
    if task.action != "get_stock":
        raise HTTPException(400, "unknown action")
    sku = task.params.get("sku")
    stock = 42  # demo value
    return Result(
        id=task.id,
        status="completed",
        result={"sku": sku, "stock": stock},
        completed_at=dt.datetime.utcnow().isoformat() + "Z",
    )
Client call:
# client_call.py
import requests, uuid
task_id = str(uuid.uuid4())
payload = {"id": task_id, "action": "get_stock", "params": {"sku": "ABC-123"}}
r = requests.post("http://localhost:8000/a2a", json=payload, headers={"Authorization": "Bearer demo"})
print(r.json())
OpenAPI + API Gateway Implementation
# orders_service.py
from fastapi import FastAPI
from pydantic import BaseModel
import uuid

app = FastAPI(title="OrdersService", version="1.0.0")

class OrderReq(BaseModel):
    sku: str
    qty: int

class OrderRes(BaseModel):
    order_id: str
    status: str

@app.post("/orders", response_model=OrderRes, tags=["orders"])
def create_order(req: OrderReq):
    return OrderRes(order_id=f"ORD-{uuid.uuid4()}", status="accepted")
API Gateway configuration (Kong):
services:
  - name: orders-svc
    url: http://orders:8000
routes:
  - name: orders-route
    service: orders-svc
    paths: ["/orders"]
plugins:
  - name: jwt   # same auth style as other micro-services
Client call:
import requests

token = "<jwt>"
resp = requests.post(
    "https://api.mybank.com/orders",
    json={"sku": "ABC-123", "qty": 1},
    headers={"Authorization": f"Bearer {token}"},
)
print(resp.json())
A Practical Middle Ground: Learning from A2A Without the Overhead
We can adopt useful concepts from A2A while avoiding unnecessary complexity:
import requests

class LightweightAgentProtocol:
    """A simplified protocol inspired by A2A concepts without the bloat."""

    def __init__(self, api_gateway_url):
        self.gateway_url = api_gateway_url

    def discover_service(self, service_name):
        """Use existing service discovery through the API gateway."""
        response = requests.get(f"{self.gateway_url}/.well-known/services/{service_name}")
        return response.json()

    def request(self, service_name, action, payload, auth_token):
        """Lightweight service request through the existing API gateway."""
        service_info = self.discover_service(service_name)
        endpoint = service_info["endpoints"][action]
        headers = {"Authorization": f"Bearer {auth_token}"}
        response = requests.post(
            f"{self.gateway_url}{endpoint}",
            json=payload,
            headers=headers,
        )
        return response.json()
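Usage might look like the following. The gateway URL, service name, and action are placeholders, and the /.well-known/services/ discovery route is something you would expose on the gateway yourself rather than a built-in feature.
# Hypothetical usage against an internal gateway
proto = LightweightAgentProtocol("https://api.internal.example.com")
result = proto.request(
    service_name="inventory",
    action="get_stock",
    payload={"sku": "ABC-123"},
    auth_token="<jwt>",
)
print(result)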
When A2A Might Make Sense vs. When to Stick with Existing Solutions
Consider A2A When:
You're building a heterogeneous agent ecosystem spanning diverse frameworks (Python, Go, JavaScript) with no shared infrastructure
You need agent discovery outside your VPC (public federated network)
You have long-running tasks where custom polling/webhook implementation would be burdensome
Stick with OpenAPI/API Gateways When:
You have tight DevOps controls and an existing API gateway infrastructure
You operate in a regulated industry needing robust audit capabilities and static contracts
You need production-hardened authentication and authorization
You want to minimize operational complexity and security surface area
Conclusion: Pragmatic Adoption vs. All-In Investment
A2A's promise of "agent interoperability out of the box" is compelling, particularly for green-field agent networks or polyglot ecosystems. However, organizations with established API gateways, OpenAPI pipelines, and service contracts may find A2A redundant or even burdensome.
The most pragmatic approach is likely experimental adoption:
Wrap one or two agents in A2A
Run them behind your existing API gateway
Measure the friction and benefits before committing to wholesale adoption
API gateways like Apigee have spent years solving the exact problems A2A aims to address: service discovery, authentication, observability, and standardized communication. Similarly, MCP already provides a standardized way for AI models to invoke tools. Before jumping on the A2A bandwagon, organizations should carefully assess whether this new protocol truly fills gaps in their existing architecture or simply adds an unnecessary abstraction layer.
As with any new technology, the value of A2A will ultimately be determined by real-world adoption and the specific problems it solves more effectively than existing approaches. For many organizations, a lightweight approach that incorporates useful A2A concepts without the full protocol overhead may provide the best balance of innovation and practicality.