# MCP vs API Plugins vs Function Calling — What's the Difference?
If you're building AI-powered tools, you've probably encountered multiple ways to connect language models to external data: MCP servers, ChatGPT plugins, function calling, and plain API wrappers. Here's how they compare and when to use each.
## The Landscape

| Approach | Standard | Runtime | Discovery | Multi-client |
|----------|----------|---------|-----------|--------------|
| MCP | Open protocol | Local process or remote | Automatic (tools, resources) | Yes (Claude, Cursor, Windsurf, etc.) |
| ChatGPT Plugins | OpenAI proprietary | Remote HTTP | OpenAPI spec | No (ChatGPT only) |
| Function Calling | Per-provider API | In API request | Manual (schema in request) | No (per-provider) |
| API Wrappers | None | Varies | None | N/A |
## MCP (Model Context Protocol)
MCP is an open standard for connecting AI to tools. An MCP server is a lightweight process that exposes:
- Tools — functions the AI can call (query database, create file, search web)
- Resources — data the AI can read (file contents, database schemas)
- Prompts — reusable templates for common tasks
Strengths:
- Open standard — works with any compatible client
- Local-first — servers run on your machine, no data leaves your network
- Composable — mix and match servers for your workflow
- Discovery — clients automatically see available tools
Best for: Developer workflows, local tooling, multi-client setups
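The discovery point is worth a closer look: an MCP client speaks JSON-RPC 2.0 and asks each server what it offers via the protocol's `tools/list` method. Here's a rough sketch of that exchange — the example tool and its schema are invented for illustration, but the request/response shape follows the MCP spec:

```python
# Sketch of MCP tool discovery (JSON-RPC 2.0).
# The "query_database" tool below is a made-up example.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# A server might answer with something like:
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [{
            "name": "query_database",
            "description": "Run a read-only SQL query",
            "inputSchema": {
                "type": "object",
                "properties": {"sql": {"type": "string"}},
            },
        }],
    },
}

# Clients read this list to populate the tools they expose to the model —
# no manual schema wiring on the client side.
tool_names = [t["name"] for t in response["result"]["tools"]]
```

Because every server answers the same question the same way, any compliant client can use any server without custom glue code.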
Browse 1800+ MCP servers on awesome-mcp.tools.
## ChatGPT Plugins (Deprecated)
OpenAI's plugin system for ChatGPT. Plugins are remote HTTP services described by an OpenAPI spec. OpenAI deprecated plugins in favor of GPTs and Actions.
Strengths:
- Web-based — no local installation
- OpenAPI standard for description
Weaknesses:
- ChatGPT only — no other clients
- Deprecated — replaced by GPT Actions
- Remote only — data must leave your network
- Slow approval process
Best for: Historical context only. Plugins are deprecated.
## Function Calling
The most common approach in API-based AI applications. You define function schemas in your API request, and the model returns structured calls.
```python
# OpenAI example (Python SDK v1+, which uses a client object
# rather than module-level calls)
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4",
    messages=[...],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather",
            "parameters": {"type": "object", "properties": {...}},
        },
    }],
)
```
Strengths:
- Simple — just JSON schemas in API calls
- Flexible — works with any backend
- Well-supported — all major providers (OpenAI, Anthropic, Google)
Weaknesses:
- Manual — you define and manage tools yourself
- No discovery — client doesn't know what tools exist
- Per-request — schemas sent with every API call
- No standard — each provider has slightly different formats
Best for: Custom AI applications, production backends, specific integrations
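To make the "manual" point concrete: the provider only returns a structured call, and your code is responsible for looking up and invoking the matching function, then sending the result back. The sketch below mimics the shape of OpenAI's `tool_calls` payload; the weather function is a stand-in:

```python
import json

def get_weather(city: str) -> dict:
    # Hypothetical local implementation
    return {"city": city, "temp_c": 18}

# You maintain this registry yourself — there is no discovery.
TOOLS = {"get_weather": get_weather}

# Shape modeled on what a provider returns in its response
tool_call = {
    "function": {
        "name": "get_weather",
        "arguments": json.dumps({"city": "Oslo"}),
    }
}

fn = TOOLS[tool_call["function"]["name"]]
args = json.loads(tool_call["function"]["arguments"])
result = fn(**args)
# You then send `result` back to the model in a follow-up message.
```

This loop is simple, but you own every part of it: schema definitions, dispatch, error handling, and keeping the registry in sync with your schemas.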
## API Wrappers / LangChain Tools
Generic approach: wrap any API in a tool definition that your AI framework understands.
```python
# LangChain example
from langchain.tools import Tool

def get_weather(city: str) -> str:
    """Call your weather API here."""
    ...

weather_tool = Tool(
    name="weather",
    func=get_weather,
    description="Get weather for a city",
)
```
Strengths:
- Maximum flexibility
- Works with any API
- Framework-specific optimizations
Weaknesses:
- No standard — tied to your framework
- Manual maintenance
- No discoverability
Best for: Complex orchestration, custom pipelines, framework-specific apps
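Stripped of framework specifics, the pattern is just a callable paired with metadata the model can read. A minimal framework-free sketch (all names are illustrative):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    description: str  # what the model sees when choosing a tool
    func: Callable[..., str]  # what your code runs when it's chosen

def get_weather(city: str) -> str:
    # Hypothetical backend call
    return f"Sunny in {city}"

weather = Tool(
    name="weather",
    description="Get weather for a city",
    func=get_weather,
)

# A framework surfaces name/description to the model
# and invokes func when the model selects the tool.
result = weather.func("Paris")
```

Every framework's tool abstraction is some variation on this, which is also why tools written for one framework rarely transfer to another.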
## When to Use What
Choose MCP when:
- You want tools that work across multiple AI clients
- You need local-first execution (privacy, speed)
- You're building developer tools
- You want a growing ecosystem of pre-built servers
Choose Function Calling when:
- You're building a custom AI application via API
- You need precise control over tool schemas
- You're in production with specific requirements
Choose API Wrappers when:
- You're using LangChain/LlamaIndex/similar frameworks
- You need complex orchestration
- Tools are tightly coupled to your application logic
## The Future
MCP is gaining momentum as the standard for AI-tool connectivity. With adoption from Anthropic (Claude), Cursor, Windsurf, Zed, and others, it's becoming the default way to extend AI assistants.
The ecosystem is growing fast — there are over 1800 MCP servers available today, covering everything from databases to browser automation to cloud platforms.
Browse the catalog or submit your own server to join the ecosystem.