Architecture Map
Host Application
Model (LLM)
Just sees tool definitions as text in its context. It has NO IDEA MCP exists.
MCP Client #1
Manages connection to server. Sends JSON-RPC requests.
MCP Client #2
Each server gets its own client instance.
JSON-RPC over stdio
JSON-RPC over HTTP
MCP Server (local)
Lightweight program. Often <100 lines. Exposes tools, resources, prompts.
Tools · Resources · Prompts
MCP Server (remote)
Same protocol, different transport. Uses HTTP + SSE.
Tools · Resources
The Three Primitives
MCP has three primitives, not one. Each serves a different purpose with different control semantics.
Tools
Model-controlled
Functions the model can invoke, typically with side effects. The model decides when to call them.
JSON-RPC Methods
tools/list
tools/call
send_email · create_issue · run_query · edit_file · web_search
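On the wire, a tool invocation is just a JSON-RPC request/response pair. A sketch in Python dicts, using the `send_email` name from the examples above (the arguments and result text are illustrative):

```python
import json

# A tools/call request as the client would send it. The tool name and
# argument schema come from the server's earlier tools/list response.
request = {
    "jsonrpc": "2.0",
    "id": 7,
    "method": "tools/call",
    "params": {
        "name": "send_email",
        "arguments": {"to": "dev@example.com", "subject": "Build failed"},
    },
}

# A typical result: content blocks the host feeds back to the model as text.
response = {
    "jsonrpc": "2.0",
    "id": 7,
    "result": {"content": [{"type": "text", "text": "Email sent."}]},
}

wire = json.dumps(request)  # what actually crosses the transport
assert json.loads(wire)["method"] == "tools/call"
```

The model never sees these envelopes; it only sees the tool schema going in and the result text coming back.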
Resources
App-controlled
Data the application can read. Like GET endpoints. The application decides when to fetch them.
JSON-RPC Methods
resources/list
resources/read
resources/subscribe
file contents · DB schema · API docs · log files · config data
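The resource methods follow the same JSON-RPC shape, but the application issues them, not the model. A sketch of a list/read pair (the URI and file contents are made up for illustration):

```python
# resources/list discovers what is available; resources/read fetches one item.
list_req = {"jsonrpc": "2.0", "id": 2, "method": "resources/list"}
read_req = {"jsonrpc": "2.0", "id": 3, "method": "resources/read",
            "params": {"uri": "file:///app/schema.sql"}}

# A typical read result: contents tagged with their URI and MIME type.
read_res = {"jsonrpc": "2.0", "id": 3,
            "result": {"contents": [{"uri": "file:///app/schema.sql",
                                     "mimeType": "text/plain",
                                     "text": "CREATE TABLE users (...);"}]}}

assert read_res["id"] == read_req["id"]
```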
Prompts
User-controlled
Reusable prompt templates with dynamic arguments. The user selects them from a menu.
JSON-RPC Methods
prompts/list
prompts/get
"Summarize this repo""Review this PR""Explain this code""Generate tests"
Transport Layer
How messages physically travel between client and server. The protocol doesn't care — it only cares about the JSON-RPC message format on top.
Server runs as a subprocess. Communication over stdin/stdout pipes. Simplest possible IPC mechanism.
Request: MCP Client (host process) → stdin → MCP Server (child process)
Response: MCP Server (child process) → stdout → MCP Client (host process)
stderr is used for logging — never for protocol messages.
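The whole transport fits in a few lines of Python: a toy child process (not a real MCP server) that answers one JSON-RPC request over stdin/stdout and logs to stderr, exactly the pipe layout described above:

```python
import json, subprocess, sys

# Toy stdio server: reads one JSON-RPC request per line on stdin,
# answers on stdout, and logs on stderr (logs never go to stdout).
SERVER = r"""
import json, sys
req = json.loads(sys.stdin.readline())
print("handling", req["method"], file=sys.stderr)
sys.stdout.write(json.dumps({"jsonrpc": "2.0", "id": req["id"],
                             "result": {"ok": True}}) + "\n")
sys.stdout.flush()
"""

proc = subprocess.Popen([sys.executable, "-c", SERVER],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE, text=True)
proc.stdin.write(json.dumps({"jsonrpc": "2.0", "id": 1, "method": "ping"}) + "\n")
proc.stdin.flush()
reply = json.loads(proc.stdout.readline())
proc.wait()
assert reply["id"] == 1 and reply["result"]["ok"]
```

Real stdio servers loop over stdin instead of exiting after one message, but the mechanics are identical.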
Protocol Evolution
2024-11-05
Initial Release
MCP specification published by Anthropic. stdio + HTTP+SSE transports.
2025-03-26
Streamable HTTP
New Streamable HTTP transport replaces HTTP+SSE. Auth improvements. Wider adoption.
Future
Evolving
Active development. Community contributions. Growing ecosystem of servers and clients.
Tool Calling vs MCP
Two different layers solving two different problems. Most confusion about MCP comes from conflating these.
Function Calling
How the LLM decides what to do
1. Developer defines functions with names, descriptions, and parameter schemas
2. These definitions are injected into the model’s context as text
3. Model reads the definitions and decides which function to call
4. Model generates a structured JSON tool call as its output tokens
5. The host application parses the JSON and executes the function
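Steps 4 and 5 are plain JSON parsing and dispatch on the host side. A minimal sketch, with a hypothetical one-function registry:

```python
import json

# Hypothetical registry mapping tool names to plain Python functions.
def web_search(query: str) -> str:
    return f"results for {query!r}"

REGISTRY = {"web_search": web_search}

# Step 4: the model emits a structured tool call as output tokens.
model_output = '{"name": "web_search", "arguments": {"query": "MCP spec"}}'

# Step 5: the host parses the JSON and executes the function itself.
call = json.loads(model_output)
result = REGISTRY[call["name"]](**call["arguments"])
assert result == "results for 'MCP spec'"
```

Note that the model never executes anything; it only emits text that the host interprets.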
MCP
How tools are discovered and executed
1. Host application connects to one or more MCP servers
2. Client sends initialize handshake, negotiates capabilities
3. Client calls tools/list to discover available tools and their schemas
4. Tool schemas are injected into the model’s context (model has no idea MCP exists)
5. When the model generates a tool call, the host routes it to the correct MCP server
6. MCP server executes the tool and returns the result via JSON-RPC
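Steps 2 and 3 of the MCP side produce a fixed message sequence. Sketched as Python dicts (capability objects are simplified; the version string follows the spec dates in the timeline above):

```python
# The initialize handshake, then discovery. Real servers return richer
# capability and serverInfo objects; this is the minimal shape.
handshake = [
    {"jsonrpc": "2.0", "id": 0, "method": "initialize",
     "params": {"protocolVersion": "2025-03-26",
                "capabilities": {}, "clientInfo": {"name": "my-host"}}},
    {"jsonrpc": "2.0", "id": 0,
     "result": {"protocolVersion": "2025-03-26",
                "capabilities": {"tools": {}},
                "serverInfo": {"name": "demo-server"}}},
    {"jsonrpc": "2.0", "method": "notifications/initialized"},  # no id: a notification
    {"jsonrpc": "2.0", "id": 1, "method": "tools/list"},
]

# Discovery (tools/list) only happens after the handshake completes.
assert handshake[-1]["method"] == "tools/list"
```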
Key Relationship
They are complementary layers: function calling is the mechanism inside the LLM API; MCP is the protocol layer that feeds tools into that mechanism and executes them on the other side.
The Costs of MCP
MCP solves real problems, but it introduces real costs. Understanding the tradeoffs is essential.
Context Window Bloating
67,000+
tokens from 4 servers
25–30% of a 200K context window consumed before the conversation even starts. Every tool definition is injected as text. 2–3x cost increase from token overhead alone.
Progressive Disclosure
On-demand
load only when needed
The modern alternative: load tools on demand instead of all at once. Claude Skills exemplifies this pattern — capabilities are loaded contextually, not dumped into the system prompt.
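The pattern itself is simple to sketch. This is my own illustration of lazy loading, not Claude Skills' actual mechanism:

```python
# Progressive disclosure sketch: tool schemas are loaded per task,
# not injected into the context all at once. Names are hypothetical.
CATALOG = {
    "email":  lambda: [{"name": "send_email", "description": "..."}],
    "github": lambda: [{"name": "create_issue", "description": "..."}],
}

loaded = {}

def tools_for(task: str):
    """Load only the tool category the current task needs."""
    if task not in loaded:
        loaded[task] = CATALOG[task]()   # deferred until first use
    return loaded[task]

assert [t["name"] for t in tools_for("email")] == ["send_email"]
assert "github" not in loaded            # unused tools cost zero tokens
```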
Security Surface
43%
had injection vulnerabilities
30% vulnerable to SSRF. 88% of servers require credentials, yet 53% use insecure static secrets. No built-in sandboxing or permission model in the protocol itself.
The Evolving Landscape
MCP isn't the only protocol vying for the agent interop space. The ecosystem is consolidating fast.
A2A
Google, April 2025
Agent-to-Agent communication
50+ launch partners. Now under Linux Foundation governance. Development has slowed since the initial announcement.
Linux Foundation / Slowing
ACP
IBM / Linux Foundation, March 2025
Agent Communication Protocol
Merged with A2A under the Linux Foundation in September 2025. Simple REST-based design, no SDK required.
Merged with A2A
AG-UI
CopilotKit, 2025
Agent-User Interaction
Event-based protocol for agent-human interaction. Fills the UI gap that MCP deliberately ignores. Active in niche use cases.
Active / Niche
Progressive Disclosure
Anthropic (Claude Skills), 2025
On-demand capability loading
Architectural pattern, not a protocol. Directly addresses context bloating by loading tools contextually instead of all at once.
Growing adoption
Key Takeaways
  • MCP is a JSON-RPC protocol. Not a product, not a service. A standardized conversation format between client and server.
  • The model never talks to MCP servers. The host application's MCP client manages all communication. The model just sees tools and results as text in its context.
  • MCP solves the N×M integration problem. Instead of every app building custom integrations for every tool, both sides implement the protocol once. N+M replaces N×M.
  • Three primitives, three control models. Tools (model-controlled), Resources (app-controlled), Prompts (user-controlled). Each exists for a reason.
  • MCP servers are simple programs. They can be written in any language, run locally or remotely, and are often under 100 lines of code. “Server” is misleading — think “adapter” or “plugin.”
  • The lifecycle starts with a handshake. Initialization → capability negotiation → discovery → execution. The handshake is what makes the protocol dynamic and extensible.
  • Transport is a detail, not the protocol. stdio for local, HTTP+SSE for remote. The JSON-RPC messages are the same regardless of how they travel.
  • MCP is the USB-C of AI. One standard interface, infinite capabilities. Before USB: proprietary cables everywhere. Before MCP: custom integrations everywhere. After MCP: plug and play.
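The N+M claim in the takeaways is simple arithmetic. With hypothetical counts of 10 host applications and 50 tools:

```python
# Integration count without a shared protocol vs. with MCP.
N, M = 10, 50                 # hypothetical: 10 hosts, 50 tools
custom_integrations = N * M   # every host wires up every tool
protocol_impls = N + M        # each side implements the protocol once

assert custom_integrations == 500
assert protocol_impls == 60
```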