MCP Hit 97 Million Installs: Why Every Developer Should Care

Model Context Protocol just hit 97 million installs and joined the Linux Foundation. Here's what this milestone means for developers and the AI tooling ecosystem.

April 13, 2026 · 7 min read

Ninety-seven million installs. That number dropped quietly in a Linux Foundation announcement and most people missed what it actually signals. This is not a vanity metric — it is evidence that MCP crossed the threshold from "interesting Anthropic experiment" to genuine developer infrastructure.

This article is about what that milestone means, how the ecosystem got there this fast, and what developers should be doing about it right now.


What MCP Is (The Short Version)

If you want the deep dive, read our Complete Guide to Model Context Protocol. The short version:

MCP is an open protocol that lets AI models connect to external tools, databases, and services through a standardized interface. Instead of every AI tool writing custom integrations with every external service — a combinatorial nightmare — you write one MCP server per service, and any MCP-compatible client can use it.
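The combinatorial point is easy to quantify. With, say, 10 AI clients and 50 services (illustrative numbers, not figures from any announcement), point-to-point integrations scale multiplicatively while a shared protocol scales additively:

```python
# Illustrative arithmetic: integration count with and without a shared protocol.
clients = 10   # hypothetical number of AI hosts (Claude Code, Cursor, ...)
services = 50  # hypothetical number of external services

# Point-to-point: every client ships a custom integration for every service.
point_to_point = clients * services

# Shared protocol: one MCP client implementation per host,
# plus one MCP server per service.
with_mcp = clients + services

print(point_to_point)  # 500
print(with_mcp)        # 60
```

Adding the 51st service costs one new server under MCP, versus ten new custom integrations without it.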

The architecture has three parts:

  • MCP Hosts — the AI applications (Claude Desktop, Claude Code, Cursor, etc.)
  • MCP Clients — protocol handlers inside the host
  • MCP Servers — lightweight processes that expose tools, resources, and prompts

Communication uses JSON-RPC 2.0 over stdio or HTTP (the spec's Streamable HTTP transport, which replaced the original HTTP-with-SSE transport in the 2025 spec revisions). It is deliberately simple: a minimal MCP server fits in about 50 lines of Python or TypeScript.

# Minimal MCP server — Python
from mcp.server import Server
from mcp.server.stdio import stdio_server
from mcp.types import Tool, TextContent
 
app = Server("my-server")
 
@app.list_tools()
async def list_tools():
    return [
        Tool(
            name="get_weather",
            description="Get current weather for a location",
            inputSchema={
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City name"}
                },
                "required": ["location"]
            }
        )
    ]
 
@app.call_tool()
async def call_tool(name: str, arguments: dict):
    if name == "get_weather":
        location = arguments["location"]
        # Your actual weather API call here
        return [TextContent(type="text", text=f"Weather in {location}: 72°F, sunny")]
    raise ValueError(f"Unknown tool: {name}")
 
async def main():
    async with stdio_server() as streams:
        await app.run(*streams, app.create_initialization_options())
 
if __name__ == "__main__":
    import asyncio
    asyncio.run(main())

That is the entire model. Implement list_tools and call_tool, expose them over stdio, and any MCP client — Claude Code, Cursor, your own app — can use those tools without any further integration work.
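Under the hood, that exchange is plain JSON-RPC 2.0. A sketch of the two requests a client writes to the server's stdin (the method names `tools/list` and `tools/call` come from the MCP spec; the `id` values and the `"Berlin"` argument are arbitrary):

```python
import json

# JSON-RPC 2.0 request asking the server to enumerate its tools.
list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# JSON-RPC 2.0 request invoking the get_weather tool defined above.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "get_weather",
        "arguments": {"location": "Berlin"},
    },
}

# Over the stdio transport, each message travels as a single
# newline-delimited JSON line on the server's stdin.
wire = "\n".join(json.dumps(m) for m in (list_request, call_request))
print(wire)
```

The server replies with matching `id`s on stdout, so requests and responses can be correlated even when they interleave.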


The 97 Million Number

The 97 million installs figure came with the announcement of MCP's move to the Linux Foundation in early 2026. To put that in perspective:

Speed of adoption. MCP was open-sourced by Anthropic in November 2024. Reaching 97 million installs in roughly 16 months is unusually fast for a developer protocol. For comparison, the npm package express took years to reach similar installation numbers after its initial release. The OpenAPI specification — another "standard interface" that changed developer tooling — took several years to reach comparable adoption.

What 97 million actually measures. This counts package installs from npm (@modelcontextprotocol/sdk), PyPI (mcp), and related packages. Each install represents a developer or a CI pipeline pulling MCP dependencies to build with the protocol. It is a proxy for active development, not passive interest.

The acceleration curve. The installs did not arrive linearly. Adoption was slow in Q4 2024, accelerated in Q1 2025 when major editors added MCP support, and has been compounding since. Every new MCP-compatible host (Cursor, Windsurf, VS Code Insiders, Zed, Claude Desktop) expanded the addressable audience for MCP servers. Every new MCP server made clients more valuable. Classic platform flywheel.


The Linux Foundation Move

This is the governance change that matters more than the install count.

Anthropic created MCP and controlled the specification. Moving it to the Linux Foundation — the same neutral stewardship body that governs the Linux kernel, Kubernetes, OpenTelemetry, and hundreds of other critical infrastructure projects — changes the political economy of the protocol completely.

What changes under Linux Foundation governance

Specification ownership shifts to a neutral body. Companies that were hesitant to build critical infrastructure on top of a protocol owned by a direct competitor — specifically OpenAI, Google, and Microsoft — now have no such conflict. The spec belongs to the foundation, not to Anthropic.

Enterprise adoption accelerates. Large enterprises care about vendor neutrality in the infrastructure they adopt. "We built on an Anthropic protocol" creates procurement and security conversations that "we built on a Linux Foundation standard" does not. This is not hypothetical — it is how enterprise technology adoption has worked for decades.

Cross-company contribution becomes real. Under foundation governance, companies can contribute to the spec, propose extensions, and influence the roadmap without going through Anthropic's GitHub. The Technical Steering Committee now includes contributors from multiple organizations.

Longevity signal. Moving to an independent foundation is a commitment signal. It says: this protocol is not going away if Anthropic pivots, gets acquired, or deprioritizes MCP. For developers building systems expected to run for years, that matters.

What does NOT change

The technical specification did not change with the governance move. Your existing MCP servers keep working. Existing clients keep working. The protocol itself is stable.


The Ecosystem: What Has Been Built

The 97 million installs make more sense when you see what has been built on top of MCP. The official MCP servers repository (github.com/modelcontextprotocol/servers) now includes production-ready implementations for:

Data and storage

  • @modelcontextprotocol/server-postgres — full read/write access to PostgreSQL databases, schema inspection, query execution
  • @modelcontextprotocol/server-sqlite — SQLite with built-in query analysis capabilities
  • @modelcontextprotocol/server-filesystem — configurable file system access with sandboxing

Developer tools

  • @modelcontextprotocol/server-github — repository management, issues, PRs, file operations, code search via the GitHub API
  • @modelcontextprotocol/server-gitlab — equivalent for GitLab
  • @modelcontextprotocol/server-git — local git operations: reading diffs, history, branches

Web and search

  • @modelcontextprotocol/server-brave-search — Brave Search API integration, web and local search
  • @modelcontextprotocol/server-fetch — web content fetching with HTML-to-markdown conversion for AI consumption
  • @modelcontextprotocol/server-puppeteer — full browser automation via Puppeteer

Productivity

  • @modelcontextprotocol/server-slack — channel management, message posting, channel history
  • @modelcontextprotocol/server-google-drive — file access, search, and read operations
  • @modelcontextprotocol/server-google-maps — geocoding, directions, place details

Observability and monitoring

  • @modelcontextprotocol/server-sentry — issue retrieval and analysis from Sentry
  • @modelcontextprotocol/server-aws-kb-retrieval — querying AWS Bedrock Knowledge Bases
  • Datadog, Grafana, and Cloudflare MCP servers from the community

Beyond the official servers, the community has produced thousands more. There are MCP servers for Notion, Linear, Jira, Figma, Stripe, PlanetScale, Supabase, Neon, and virtually every major SaaS tool that developers use.


Other AI Providers Adopting MCP

The Linux Foundation move unlocked something significant: competitors adopted the protocol.

OpenAI announced support for MCP in their desktop applications and API tooling. This is the clearest signal that MCP is becoming infrastructure, not competitive advantage. When the maker of GPT-4o implements your protocol spec, the spec won.

Google integrated MCP support into Gemini's tooling ecosystem, including ADK (Agent Development Kit). Gemini-powered tools can now consume MCP servers written for Claude, and vice versa.

Microsoft added MCP support to GitHub Copilot's agent mode and to VS Code's extension for AI tooling. Copilot agents can use MCP servers to access external data.

Cursor and Windsurf — the AI-native editors — both ship with MCP support as a core feature. These are the editors that a significant percentage of active developers use daily.

The protocol convergence is real. You can now write an MCP server once and have it work with Claude, GPT-4o, Gemini, Copilot, Cursor, and Windsurf. That is the value proposition of any open standard, and it is now actually delivering.


How to Use MCP with Claude Code Today

Here is the practical part. If you are using Claude Code, adding MCP servers takes minutes.

Add a server globally

# Add the GitHub MCP server globally (available in all projects)
claude mcp add github -- npx -y @modelcontextprotocol/server-github

Supply your token with -e GITHUB_PERSONAL_ACCESS_TOKEN=ghp_... before the -- separator, or set it in the server's env block. After that, Claude Code can read repositories, create issues, and manage PRs directly.

Add a server for a specific project

# From inside your project directory
claude mcp add postgres -- npx -y @modelcontextprotocol/server-postgres postgresql://localhost/mydb

By default the server is added with local scope, so it only activates when you run Claude Code in that project. Pass --scope project instead to write it to a .mcp.json file in the project root that can be checked into version control and shared with your team.

Add via JSON configuration

For more control, edit the mcpServers block directly, either in ~/.claude.json for user-scoped servers or in a project's .mcp.json:

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you/projects"],
      "env": {}
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_your_token_here"
      }
    },
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres", "postgresql://localhost:5432/mydb"],
      "env": {}
    }
  }
}
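Whichever file the block lives in, the shape is simple enough to sanity-check with a few lines of stdlib Python (`check_mcp_config` is a hypothetical helper for illustration, not part of any SDK):

```python
import json

def check_mcp_config(text: str) -> list[str]:
    """Return the server names in an mcpServers config, validating the basic shape."""
    config = json.loads(text)
    servers = config.get("mcpServers", {})
    for name, spec in servers.items():
        # Every stdio server entry needs a command; args and env are optional.
        assert "command" in spec, f"{name}: missing 'command'"
        assert isinstance(spec.get("args", []), list), f"{name}: 'args' must be a list"
        assert isinstance(spec.get("env", {}), dict), f"{name}: 'env' must be an object"
    return sorted(servers)

example = """
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  }
}
"""
print(check_mcp_config(example))  # ['filesystem']
```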

Verify servers are connected

claude mcp list
# filesystem: connected (npx @modelcontextprotocol/server-filesystem)
# github: connected (npx @modelcontextprotocol/server-github)
# postgres: connected (npx @modelcontextprotocol/server-postgres)

Once connected, Claude Code uses the servers automatically when the task calls for it. Ask it to "check the open GitHub issues labeled bug" and it will use the GitHub MCP server. Ask it to "show me the schema for the users table" and it will query your Postgres server.

Building your own MCP server

The TypeScript SDK is the most complete:

npm install @modelcontextprotocol/sdk

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";
 
const server = new Server(
  { name: "my-custom-server", version: "1.0.0" },
  { capabilities: { tools: {} } }
);
 
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "get_customer",
      description: "Fetch customer data by ID",
      inputSchema: {
        type: "object",
        properties: {
          customerId: { type: "string", description: "Customer UUID" },
        },
        required: ["customerId"],
      },
    },
  ],
}));
 
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name === "get_customer") {
    const { customerId } = request.params.arguments as { customerId: string };
    // Your database call here
    const customer = await db.customers.findById(customerId);
    return {
      content: [{ type: "text", text: JSON.stringify(customer, null, 2) }],
    };
  }
  throw new Error(`Unknown tool: ${request.params.name}`);
});
 
const transport = new StdioServerTransport();
await server.connect(transport);

Build it, point Claude Code at it, and you have a custom tool that Claude can use for any task in that project.
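As a server grows past a couple of tools, the if/throw dispatch in the handler above is commonly replaced with a registry keyed by tool name. A language-agnostic sketch of that pattern in Python (the names `tool` and `dispatch` are illustrative, not SDK API):

```python
from typing import Any, Callable

# Map tool names to handler functions instead of chaining if-statements.
TOOLS: dict[str, Callable[[dict], Any]] = {}

def tool(name: str):
    """Decorator that registers a handler under a tool name."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("get_customer")
def get_customer(args: dict) -> dict:
    # Your database call here; returning a stub for illustration.
    return {"id": args["customerId"], "name": "Ada"}

def dispatch(name: str, args: dict) -> Any:
    if name not in TOOLS:
        raise ValueError(f"Unknown tool: {name}")
    return TOOLS[name](args)

print(dispatch("get_customer", {"customerId": "42"}))  # {'id': '42', 'name': 'Ada'}
```

The call_tool handler then reduces to a single dispatch call, and adding a tool means adding one decorated function.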


What Is Coming Next

The MCP roadmap, now shaped by the Linux Foundation Technical Steering Committee, has several items in active development:

Authentication and authorization. The current protocol has minimal auth — you pass tokens as environment variables or configure them manually. The roadmap includes native OAuth 2.0 support within the protocol, so MCP servers can participate in standard enterprise auth flows without workarounds.

Remote server discovery. Right now, you have to know about an MCP server and configure it manually. The roadmap includes a discovery mechanism — similar to service registries — where clients can find available servers programmatically.

Multi-server orchestration. When you have multiple MCP servers connected, the current model is flat: the client calls whichever server has the right tool. The roadmap includes richer orchestration semantics for more complex multi-server workflows.

Streaming and long-running operations. The protocol currently models tools as request/response. The roadmap extends this with streaming support for tools that return incremental output and for long-running operations that report progress.

Sampling and server-side LLM calls. MCP servers will be able to request LLM completions from the host client, not just expose tools. This enables more sophisticated server logic without requiring the server to manage its own LLM API access.

These are not speculative future ideas — they are tracked issues and accepted proposals in the MCP specification repository under Linux Foundation governance.


Why This Matters: MCP Is Becoming Infrastructure

There is a pattern in the history of computing where a protocol crosses a certain adoption threshold and stops being a technology choice and starts being an assumption. HTTP is not something you choose to use for web communication — it is the assumption every browser and server is built on. REST is not something you choose for API design — it is the default mental model. SQL is not something you choose for relational databases — it is the language.

MCP is crossing that threshold in AI tooling.

When OpenAI, Google, Anthropic, Microsoft, and the major AI editors all implement the same protocol, the question stops being "should I use MCP?" and starts being "which MCP server do I need?" It becomes the assumption.

The 97 million installs are the numerical evidence that this shift is already underway. The Linux Foundation move is the institutional evidence. The multi-vendor adoption is the strategic evidence.

For developers, the practical implication is straightforward: any tool integration you build for AI tooling should be an MCP server. Not a custom integration. Not an OpenAI plugin. An MCP server — because that one implementation now works with every AI client that matters.

The developers who understood HTTP early built the web. The developers who understood REST early defined how APIs work for a decade. The developers who understand MCP now are building the tooling layer that every AI application will run on.


Getting Started

If you have not used MCP yet, the path is simple:

  1. Read our Complete Guide to MCP for the fundamentals
  2. Read How to Add MCP Servers to Claude Code for the setup walkthrough
  3. Add one server today — the GitHub or filesystem server is the easiest starting point
  4. When you have a tool integration to build for your own project, build it as an MCP server

The protocol is stable. The ecosystem is real. The governance is solid. There is no good reason to wait.


The Model Context Protocol specification and official server implementations are hosted at github.com/modelcontextprotocol. The MCP TypeScript SDK and Python SDK are available on npm and PyPI respectively.

#mcp #model-context-protocol #ai-agents #claude-code #anthropic