When Anthropic released the Model Context Protocol in late 2024, the announcement was met with interest but little fanfare. It was positioned as a developer tool — a technical standard for connecting AI agents to external systems. Eighteen months later, MCP has crossed 97 million installations and is being described by engineers, architects, and CTOs as the connective tissue of the modern AI stack.
That kind of trajectory does not happen by accident. It happens when a piece of infrastructure solves a real problem that thousands of teams are independently struggling with at the exact right moment. That is precisely what MCP did.
What Is MCP, and Why Does It Matter?
MCP — Model Context Protocol — is an open standard that defines how AI agents communicate with external tools, data sources, APIs, and live systems. Think of it as USB-C, but for AI: a universal interface that eliminates the need for custom, one-off integrations between models and the world around them.
Before MCP, every team building an AI agent faced the same unglamorous problem. Connecting an agent to a database required custom code. Connecting it to a CRM required different custom code. Connecting it to a ticketing system, a calendar, a file store — each integration was its own project. Every time the underlying model changed, those integrations often broke.
MCP broke that cycle by providing a single, stable interface layer. Agents built on MCP can connect to any MCP-compatible server. Tool builders who expose their APIs through an MCP server immediately become accessible to every MCP-compatible agent. The network effects compound in both directions.
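Concretely, that stable interface layer is a set of JSON-RPC 2.0 messages: a host asks a server what tools it offers (`tools/list`), then invokes one (`tools/call`). A minimal sketch of the message shapes, using only the standard library; the `get_weather` tool and its arguments are hypothetical placeholders, not part of any real server:

```python
import json

# An MCP host first discovers a server's tools...
list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# ...then invokes one by name with structured arguments.
# "get_weather" is a made-up example tool.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "get_weather",
        "arguments": {"city": "Berlin"},
    },
}

# On the wire, each message is serialized as a single JSON object.
wire = json.dumps(call_request)
decoded = json.loads(wire)
print(decoded["method"])          # tools/call
print(decoded["params"]["name"])  # get_weather
```

Because every agent and every server agrees on these message shapes, swapping the model on one side or the tool on the other does not break the connection.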
The USB-C Analogy — Why It Works
USB-C succeeded not because it was the most technically elegant connector ever designed, but because it was universal enough that both device makers and cable makers had an incentive to adopt it. Once critical mass was reached, the standard became self-reinforcing. MCP is following the same logic: model-agnostic design, open specification, and community-built connectors have created the same kind of self-reinforcing ecosystem in the AI tooling space.
Why 97 Million Is Not Just a Vanity Metric
Raw installation counts are easy to dismiss. But 97 million installations of a developer infrastructure tool tell a specific story that is worth unpacking.
It means thousands of engineering teams are now building on the same connective layer. An agent built today on MCP can plug into tools and data sources that did not exist when the agent was first designed — as long as those tools expose an MCP server. The ecosystem is compounding: every new MCP-compatible tool makes every existing MCP-compatible agent more capable, without the agent itself needing to change.
This is exactly what infrastructure adoption looks like when it crosses the tipping point. We saw it with REST APIs in the early 2000s — suddenly, every web service was accessible through a common pattern, and entire categories of products became possible that had been impractical before. We saw it again with Docker in the mid-2010s, when container standardization transformed how software was shipped. MCP is following that same arc.
The compounding effect in practice: As of April 2026, the community has built ready-made MCP servers for GitHub, Slack, Notion, PostgreSQL, Jira, Google Drive, Salesforce, and dozens of other enterprise tools. A team building an AI agent today does not write database connectors or API integrations — they connect to existing MCP servers. The time from "idea" to "agent with real-world capabilities" has collapsed from weeks to hours.
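Wiring an agent host to those existing servers is typically a few lines of configuration rather than integration code. A hedged sketch in the style used by MCP hosts such as Claude Desktop; the exact keys and connection strings vary by host, and the token value is a placeholder:

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "<your-token>" }
    },
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://localhost/mydb"
      ]
    }
  }
}
```

With a config like this in place, the host launches each server and every tool those servers expose becomes available to the agent with no custom connector code.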
Three Forces Behind the Growth
The 97 million figure is the result of three forces converging simultaneously, each of which would have been significant on its own. Together, they created a growth trajectory that caught even optimistic observers off guard.
Model-Agnostic Design
MCP was designed from the beginning to work with any large language model — Claude, GPT-4o, Gemini, Mistral, and the full range of open-source models. This was a deliberate and consequential architectural choice. Enterprises do not commit to a single model family. They run multiple models for different use cases, hedge against vendor lock-in, and switch models as the competitive landscape shifts. A standard that only works with one provider has a ceiling. A standard that works with all of them has no ceiling.
The Agent Economy Became Real
In 2025, AI agents moved from conference demos to production deployments. Real workflows. Real money. Real accountability. When agents need to actually do things — query a live database, send a notification, update a record, pull a report — the integration problem becomes urgent rather than theoretical. MCP was ready when urgency arrived. The demand pulled the adoption, not the other way around.
Open-Source Flywheel
Because the specification is open, the community built the connectors. No single company needed to build and maintain integrations with every tool in the enterprise stack — thousands of developers did it collaboratively. The result is a library of ready-made MCP servers that covers most of what enterprise teams actually need. You do not build the bridge anymore. You just cross it.
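For tool builders, the server side of that bridge is simple enough to sketch. Most teams use an official MCP SDK in practice, but the core job is just a registry of tools and a dispatcher for the two methods shown earlier. Everything below (the decorator, the `add` tool) is illustrative stdlib code, not the SDK's API:

```python
# A toy MCP-style server core: a registry maps tool names to handlers,
# and a dispatcher exposes them via tools/list and tools/call.
# Real servers use an official SDK and speak JSON-RPC over stdio or HTTP.
TOOLS = {}

def tool(name, description):
    """Register a handler under a tool name (illustrative decorator)."""
    def register(fn):
        TOOLS[name] = {"description": description, "handler": fn}
        return fn
    return register

@tool("add", "Add two integers")
def add(a: int, b: int) -> int:
    return a + b

def handle(request: dict) -> dict:
    """Dispatch one JSON-RPC request to the matching MCP method."""
    if request["method"] == "tools/list":
        result = {"tools": [{"name": n, "description": t["description"]}
                            for n, t in TOOLS.items()]}
    elif request["method"] == "tools/call":
        params = request["params"]
        value = TOOLS[params["name"]]["handler"](**params["arguments"])
        result = {"content": [{"type": "text", "text": str(value)}]}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

response = handle({"jsonrpc": "2.0", "id": 1, "method": "tools/call",
                   "params": {"name": "add", "arguments": {"a": 2, "b": 3}}})
print(response["result"]["content"][0]["text"])  # 5
```

Anything registered this way is immediately reachable by every MCP-compatible agent, which is the mechanism behind the flywheel: each community-built server extends all agents at once.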
What This Signals About the State of AI Infrastructure
The milestone of 97 million installations is more than a growth story. It is a signal about where the AI ecosystem is in its maturation curve.
Early AI tools were powerful but siloed. A language model could generate text, answer questions, summarize documents — but it could not take meaningful action in the world without a substantial amount of custom engineering around it. That engineering work was duplicated across thousands of teams, each building slightly different versions of the same integration plumbing.
What MCP represents is the standardization of that plumbing. What is emerging now is an interconnected fabric in which models, tools, memory systems, and data sources speak the same language. That is a qualitative shift in what is possible, not just a quantitative one.
Interoperability at this scale means that the capabilities of individual agents are no longer bounded by what any single team has the resources to integrate. They are bounded by the size of the MCP ecosystem — which, at 97 million installations, is already enormous and still growing.
The Bottom Line
The race in AI is no longer only about which model is the most capable. It is increasingly about which systems are the most connected — which agents can reach the most tools, act on the most data, and operate across the most surfaces without requiring custom engineering for every new integration.
MCP is infrastructure. And the consistent lesson of technology history is that whoever establishes the infrastructure layer shapes what gets built on top of it. At 97 million installations, MCP has established that layer for AI agents.
97 million installations is a milestone. The real story is the decade of compounding that follows from having a universal connective standard in place before the agent economy reaches its full scale.
Frequently Asked Questions
What is the Model Context Protocol (MCP)?

MCP is an open-source standard created by Anthropic that defines how AI agents connect to external tools, APIs, databases, and live systems. It provides a universal interface layer that eliminates the need for custom integrations between AI models and the tools they need to interact with. It is model-agnostic — meaning it works with Claude, GPT, Gemini, Mistral, and open-source models alike.
Why do 97 million installations matter?

For a developer infrastructure standard, 97 million installations indicates ecosystem-level adoption — the point at which network effects become self-reinforcing. Every new MCP-compatible tool increases the value of every existing MCP-compatible agent, and vice versa. This compounding dynamic is what infrastructure standards look like when they reach critical mass, comparable to the adoption curves of REST APIs and Docker containers in previous decades.
Which MCP servers are available today?

The community has built MCP servers for a wide range of enterprise tools including GitHub, Slack, Notion, PostgreSQL, Jira, Google Drive, Salesforce, and many others. Because the specification is open, new integrations are added continuously by the developer community. The full list of available MCP servers is maintained in the open-source MCP repository.
Does MCP only work with Anthropic's models?

No. MCP is model-agnostic by design. While it was created by Anthropic, the standard works with any large language model that implements it — including models from OpenAI, Google, Mistral, Meta, and the broader open-source ecosystem. This neutrality was a deliberate design choice that has been central to its broad adoption.