
The Model Context Protocol: How Claude Talks to Your Tools

MCP is Anthropic's open protocol. Here's what it is, the four MCP servers we ship in the VibeCoded Orchestrator, and how our agents call them.

Before MCP

Language models produce text. They can't read files, query databases, or fetch web pages without a bridge to the outside world. Before 2024, each vendor invented its own bridge: OpenAI's function calling, IDE plugin schemas, vendor-specific integrations rebuilt for each platform.

What MCP is (and is not)

Model Context Protocol (MCP) is an open specification announced by Anthropic in November 2024. The current spec is dated November 25, 2025, governed via Spec Enhancement Proposals and Working Groups on GitHub. The 2026 roadmap covers remote transport scaling, the new Tasks primitive for long-running work, and enterprise auth/audit.

MCP is the protocol our products speak. An MCP server exposes three primitives:

  • Tools: callable functions with JSON schemas
  • Resources: readable data
  • Prompts: reusable templates

A client (Claude Code, Cursor, etc.) launches the server, reads its tool schemas, and decides when to call them. Wire format: JSON-RPC 2.0. Transport: stdio (local subprocess) or HTTP (remote). MCP is heavier than function calling—you run a separate process—but provider-independent and model-agnostic.
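To make the wire format concrete, here is a minimal sketch of how a client frames the two most common MCP requests—`tools/list` and `tools/call`—as newline-delimited JSON-RPC 2.0 over stdio. The tool name and arguments mirror the hybrid_search example later in this post; the framing helper is illustrative, not a library API.

```python
import json

def jsonrpc_request(req_id, method, params):
    """Frame a JSON-RPC 2.0 request as one newline-delimited line (stdio transport)."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}
    return json.dumps(msg) + "\n"

# Discover the server's tools, then call one by name with JSON arguments.
list_req = jsonrpc_request(1, "tools/list", {})
call_req = jsonrpc_request(2, "tools/call", {
    "name": "hybrid_search",
    "arguments": {"query": "rate limiting patterns", "limit": 5},
})
```

Each line is written to the server subprocess's stdin; responses come back the same way on stdout, keyed by `id`.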

The four servers

The VibeCoded Orchestrator ships four MCP servers:

  1. weaviate-kg: semantic search over local knowledge and code graphs. Tools: hybrid_search, semantic_graph_search, store_knowledge_node, search_code_graph, query_code_structure. Embeddings: qwen3-embedding:0.6b (text, 1024-d) and CodeSage-Large-v2 (code, 2048-d).

  2. ollama: local LLM inference, free. Tools: chat, read_document, read_image. Use for summaries, rewrites, vision instead of Claude API tokens.

  3. search: web, code, and academic search. Tools: web_search (SearXNG), search_code (GitHub), search_papers (OpenAlex, arXiv), fetch_page. No API keys required.

  4. code_embedding: FastAPI service for CodeSage-Large-v2 embeddings. Backend for the code graph.
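Wiring servers like these into a client is pure configuration. As a sketch only—the command names and package identifiers below are hypothetical, not the shipped defaults—a Claude Code-style `.mcp.json` entry for two stdio servers might look like:

```json
{
  "mcpServers": {
    "weaviate-kg": {
      "command": "uvx",
      "args": ["vibecoded-weaviate-kg"]
    },
    "search": {
      "command": "uvx",
      "args": ["vibecoded-search"]
    }
  }
}
```

The client spawns each `command` as a subprocess, speaks JSON-RPC over its stdio, and merges the advertised tools into the model's toolbox.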

How agents actually use them

A concrete trace. You ask: "How did we handle rate limiting in past projects?"

  1. Claude reads the weaviate-kg schema, picks hybrid_search.
  2. Sends {"query": "rate limiting patterns", "limit": 5} over stdio as a JSON-RPC call.
  3. The server embeds the query, runs hybrid keyword + vector search across your KG, returns 5 nodes with excerpts.
  4. Claude synthesizes an answer grounded in those real nodes — not hallucinated.
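The trace above can be sketched from the client's side. The response below is simulated (the node excerpts are invented for illustration), but the shape is the standard MCP tool result: a `result.content` list of typed blocks that the client flattens into context for the model.

```python
import json

# Simulated reply from the weaviate-kg server to the hybrid_search call (id=2).
raw = json.dumps({
    "jsonrpc": "2.0", "id": 2,
    "result": {"content": [
        {"type": "text", "text": "node: api-gateway — token-bucket limiter, 100 req/s"},
        {"type": "text", "text": "node: billing-svc — Redis sliding-window counter"},
    ]},
})

reply = json.loads(raw)
# Keep only text blocks; these become the grounded excerpts Claude reads.
excerpts = [b["text"] for b in reply["result"]["content"] if b["type"] == "text"]
for e in excerpts:
    print(e)
```

The model never sees the transport; it only sees the excerpts, which is why the final answer stays grounded in real knowledge-graph nodes.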

Agents use chat for local rewrites, web_search for live docs, search_code for GitHub examples, query_code_structure for refactoring context. Agents call MCP tools instead of grepping. That's the point.

Adding the team layer

These four cover single-developer workflows. For team coordination—shared decisions, cross-machine messaging, decision rationale—we ship a separate product, the Coordination MCP, backed by Supabase. Self-host free, managed pricing TBD. Same protocol. See /products/coordination-mcp.

A planned Telegram Module MCP will expose Telegram chat as a tool surface to drive the orchestrator from your phone.

Where we fit in the MCP ecosystem

By early 2026 MCP is the dominant cross-vendor standard for AI tool access. OpenAI adopted MCP in March 2025 and extended support to ChatGPT apps in September 2025. Public registries index thousands of community servers.

Client                         | MCP support                              | Open source
Claude Desktop / Claude Code   | Native                                   | Protocol yes, client no
VibeCoded Orchestrator         | Native, 4 servers + optional Coord MCP   | Yes
Cursor / Windsurf              | Yes                                      | No
Cline / Zed / Continue.dev     | Yes                                      | Yes
VS Code + Copilot              | Yes                                      | Partial
OpenAI ChatGPT / Apps          | Yes (since 2025)                         | No
LangChain / LangGraph          | Via adapters                             | Yes

Every Claude Code MCP server works with the Orchestrator. We ship defaults; you can add community servers. Function calling survives for single-vendor, low-latency use. For anything model-independent, MCP wins.

The four MCP servers above are free with the VibeCoded Orchestrator — local, open, under your control. The team-sync layer is the Coordination MCP.

Sources: