TL;DR: We are launching MCP over API, a new way to connect your AI applications to hosted Model Context Protocol (MCP) tools through standard REST endpoints, so you can use MCP tools in production without running local MCP servers. We’ve also released official TypeScript and Python SDKs to simplify integration. Check out the SDKs and code examples in our API documentation.
Serving MCP Over API: A Complete Guide
We are thrilled to announce the launch of Model Context Protocol (MCP) over APIs! This groundbreaking development bridges the gap between the powerful capabilities of MCP and the universal accessibility of REST APIs. In this comprehensive guide, we’ll explore why this matters, how it works, and how you can start implementing it in your applications today.
Universal LLM Compatibility
ToolRouter supports all major LLM providers out of the box, making it the most flexible tool routing solution available:
- OpenAI - Seamlessly connect ChatGPT and GPT-4 models to your tools
- Anthropic - Full support for Claude models with native tool calling
- OpenRouter - Access to dozens of frontier models through a single integration
This universal compatibility means you can select the best model for your specific needs while maintaining a consistent tool integration approach. As LLM capabilities evolve, your ToolRouter implementation remains future-proof.
Extensive Tool Library At Your Fingertips
ToolRouter provides instant access to 30+ production-ready tools that cover virtually every use case:
- Productivity - Google Drive, Gmail, Google Calendar, Notion, Trello, Todoist
- Development - GitHub, GitLab, Railway, E2B, Context7, Postgres, Neon
- Design - Figma with 25+ operations for comprehensive design workflows
- Data & Research - Exa Search, Perplexity, Brightdata, HyperBrowser
- Communication - Slack, Discord, X/Twitter, VAPI
- E-commerce - Shopify, Airbnb integrations
- And more - Our tool library is constantly expanding
This means you can start building powerful AI applications immediately without having to develop or host individual tool integrations. Access to this entire ecosystem is available through a single, consistent API.
The Evolution of MCP: From Local to Global
The Model Context Protocol has revolutionized how AI models interact with tools and data sources, creating a standardized interface for LLMs to access external capabilities. However, until now, there’s been a significant limitation: MCPs were primarily designed for local environments.
As more developers build production applications powered by AI, the need for a more scalable, cloud-native approach to MCP became evident. Our solution? Bringing MCP to the web through standardized REST APIs.
Why MCP Over API Matters
The Universal Translator for LLMs and Tools
MCP functions as a universal translator that helps every LLM understand what a tool has to offer. Think of it as USB-C for function-calling tools: a standardized interface that ensures compatibility across different systems. This standardization is crucial for building consistent AI experiences.
When we say MCP is a translator, we mean it literally: it takes the capabilities of a tool and presents them in a format that any LLM can understand and use effectively. This abstraction layer means developers don’t need to worry about the specifics of how each LLM handles tool calling differently.
Multi-LLM Support Without Refactoring
One of the most powerful aspects of ToolRouter is its ability to adapt to different LLM formats automatically:
- OpenAI’s function calling - ToolRouter seamlessly maps to OpenAI’s function calling format
- Anthropic’s tool use - Full compatibility with Claude’s native tool use API
- OpenRouter’s unified API - Connect to numerous frontier models through a single integration
This means you can:
- Switch between LLM providers without changing your tool integrations
- Use multiple LLMs simultaneously with the same set of tools
- Future-proof your application against changes in LLM tool calling implementations
With a single ToolRouter integration, your application becomes instantly compatible with the entire ecosystem of modern AI models.
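To make that adaptation concrete, here is a minimal sketch of what the mapping looks like. It assumes each tool description carries an MCP-style name, description, and JSON Schema for its inputs; the McpTool field names below are illustrative, so check the API reference for the exact shape your stack returns:

// Sketch: presenting one MCP-style tool description in two provider formats.
// The McpTool shape is an assumption for illustration.
interface McpTool {
  name: string;
  description: string;
  input_schema: Record<string, unknown>; // JSON Schema describing the tool's parameters
}

// OpenAI's function-calling format
function toOpenAITool(tool: McpTool) {
  return {
    type: 'function' as const,
    function: {
      name: tool.name,
      description: tool.description,
      parameters: tool.input_schema,
    },
  };
}

// Anthropic's tool-use format
function toAnthropicTool(tool: McpTool) {
  return {
    name: tool.name,
    description: tool.description,
    input_schema: tool.input_schema,
  };
}

Whether this mapping happens inside ToolRouter or in a thin adapter layer of your own, the tool definitions themselves stay the same when you swap providers.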
The Scalability Challenge
As organizations deploy AI applications to production, a critical challenge emerges: how do you manage MCPs at scale? Consider this scenario:
- You have a production application running across 100 instances
- Each instance needs access to the same set of tools
- Installing and maintaining MCP servers on each instance is impractical
- Security and versioning become exponentially more complex
In this environment, local MCPs simply don’t scale. You can’t reasonably install and manage MCP servers across all your production instances without significant operational overhead.
Hosted MCPs: The Enterprise Solution
Hosted MCPs solve this scalability problem elegantly. By centralizing your MCP servers and exposing them through APIs, you can:
- Manage all your tools in one place
- Ensure consistent versioning across all instances
- Implement robust security controls at a single entry point
- Scale your application independently from your tool infrastructure
The Perfect Marriage: LLMs Love MCP & Apps Love APIs
This approach gives you the best of both worlds:
- For LLMs: The structured, semantic understanding provided by MCP that makes tools truly useful
- For Applications: The familiar, scalable, and secure access pattern of REST APIs
This combination preserves all the benefits of MCP while eliminating the deployment complexities, making enterprise-grade AI applications more feasible than ever.
How to Implement MCP Over API
Getting started with MCP over API is straightforward with ToolRouter. Here’s a step-by-step guide:
1. Create Your Tool Stack
- Sign in to https://app.toolrouter.ai
- Navigate to the “Stacks” section
- Create a new stack (collection) containing all the MCP servers you need
- Configure each tool with the necessary credentials and settings
A stack functions as a logical grouping of tools that your application can access through a single API endpoint. This simplifies management and allows you to create different tool sets for different use cases or applications.
2. Generate Access Credentials
Once your stack is configured:
- Go to the “Connect” section
- Generate an API key and token for accessing your stack
- Store these credentials securely according to best practices
Your API key identifies your account, while the token provides secure access to specific stacks. Together, they ensure that only authorized applications can access your tools.
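One simple way to keep these credentials safe is to keep them out of source control and load them from the environment at startup. A minimal Node.js sketch (the variable names are just a convention invented for this example, not something ToolRouter requires):

// Sketch: loading ToolRouter credentials from environment variables at startup.
// TOOLROUTER_CLIENT_ID and TOOLROUTER_API_TOKEN are illustrative names.
const clientId = process.env.TOOLROUTER_CLIENT_ID;
const token = process.env.TOOLROUTER_API_TOKEN;

if (!clientId || !token) {
  throw new Error('Missing ToolRouter credentials in the environment');
}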
3. Integrate with Your Application
With your credentials in hand, you can now integrate MCP capabilities into your application using two primary endpoints:
Listing Available Tools
// Example: Listing available tools in your stack
async function listTools(clientId, token) {
  const response = await fetch(`https://api.toolrouter.ai/s/${clientId}/list_tools`, {
    method: 'GET',
    headers: {
      'Authorization': `Bearer ${token}`
    }
  });
  const result = await response.json();
  return result.tools;
}
This endpoint returns a comprehensive list of all tools available in your stack, including their names, descriptions, required parameters, and return types. This information can be passed directly to LLMs to enable them to use these tools effectively.
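For example, the tool list can be handed to OpenAI’s chat completions API as function-calling tools. The sketch below uses the official openai npm package and assumes each tool object carries a name, description, and a JSON Schema under input_schema; adjust the field names to whatever your stack actually returns:

// Sketch: forwarding the stack's tools to an OpenAI chat completion.
// Assumes each tool has { name, description, input_schema } (illustrative field names).
import OpenAI from 'openai';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function askWithTools(clientId: string, token: string, userMessage: string) {
  const tools = await listTools(clientId, token);

  const response = await openai.chat.completions.create({
    model: 'gpt-4o',
    messages: [{ role: 'user', content: userMessage }],
    tools: tools.map((tool: any) => ({
      type: 'function' as const,
      function: {
        name: tool.name,
        description: tool.description,
        parameters: tool.input_schema,
      },
    })),
  });

  return response.choices[0].message; // may contain tool_calls for your application to execute
}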
Calling Tools
// Example: Calling a tool with parameters
async function callTool(clientId, token, toolName, toolInput) {
  const response = await fetch(`https://api.toolrouter.ai/s/${clientId}/call_tool`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${token}`
    },
    body: JSON.stringify({
      tool_name: toolName,
      tool_input: toolInput
    })
  });
  const result = await response.json();
  return result;
}
This endpoint executes the specified tool with the provided parameters and returns the result. This can be used both in AI agent workflows and in traditional application logic.
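As a quick usage sketch, application code can invoke a tool directly with the callTool helper above. The tool name and input fields here are hypothetical placeholders; substitute the names returned by list_tools for your stack:

// Sketch: calling a tool from plain application logic (no LLM involved).
// 'exa_search' and its input shape are hypothetical placeholders.
const searchResults = await callTool(clientId, token, 'exa_search', {
  query: 'latest MCP announcements',
});
console.log(searchResults);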
4. Use Our Official SDKs
To simplify integration even further, we’ve created official SDKs for Python and TypeScript that handle all the API communication details for you.
TypeScript/JavaScript SDK
Our TypeScript SDK is available on npm and provides a clean, type-safe interface to the ToolRouter API:
npm install toolrouter
Basic usage:
import { ToolRouter } from 'toolrouter';
// Initialize with your credentials
const router = new ToolRouter('your-client-id', 'your-api-key');
// List available tools
const tools = await router.listTools();
console.log('Available tools:', tools);
// Call a specific tool
const result = await router.callTool('example-tool', {
  param1: 'value1',
  param2: 'value2'
});
console.log('Tool result:', result);
For convenience, we also provide helper functions:
import { setupDefaultRouter, listTools, callTool } from 'toolrouter';
// Set up once
setupDefaultRouter('your-client-id', 'your-api-key');
// List tools and call them using the default router
const tools = await listTools();
const result = await callTool('example-tool', { param1: 'value1' });
Python SDK
Our Python SDK is available on PyPI:
pip install toolrouter
Basic usage:
from toolrouter import ToolRouter
# Initialize with your credentials
router = ToolRouter(client_id="your-client-id", api_key="your-api-key")
# List available tools
tools = router.list_tools()
print("Available tools:", tools)
# Call a specific tool
result = router.call_tool("example-tool", {"param1": "value1", "param2": "value2"})
print("Tool result:", result)
Both SDKs provide:
- Type-safe interfaces for tool parameters and responses
- Automatic handling of authentication
- Comprehensive error handling
- Detailed documentation and examples
Visit our SDK Documentation for complete reference and advanced usage examples.
Integration Patterns
There are several ways to integrate MCP over API into your applications:
Direct AI Agent Integration
In this pattern, your AI agent communicates directly with the ToolRouter API:
Code Examples: Python | JavaScript
- AI agent needs a capability (e.g., search the web)
- Application fetches available tools from ToolRouter API
- Tools are presented to the AI in its prompt
- AI decides to use a tool and generates a tool call
- Application sends the tool call to ToolRouter API
- Result is returned and presented back to the AI
This approach works well for applications where the AI agent drives the tool usage, similar to how ChatGPT plugins work.
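Putting those steps together with the listTools, callTool, and openai helpers sketched earlier, one turn of the loop might look like this (the model name is illustrative, and the tool field names remain assumptions):

// Sketch: one turn of the direct agent loop with OpenAI function calling.
async function runAgentTurn(clientId: string, token: string, userMessage: string) {
  const tools = await listTools(clientId, token);
  const openaiTools = tools.map((tool: any) => ({
    type: 'function' as const,
    function: { name: tool.name, description: tool.description, parameters: tool.input_schema },
  }));

  const messages: any[] = [{ role: 'user', content: userMessage }];
  const first = await openai.chat.completions.create({ model: 'gpt-4o', messages, tools: openaiTools });
  const assistantMessage = first.choices[0].message;
  messages.push(assistantMessage);

  // Execute any tool calls the model requested via ToolRouter, then return the results to the model
  for (const toolCall of assistantMessage.tool_calls ?? []) {
    const result = await callTool(clientId, token, toolCall.function.name, JSON.parse(toolCall.function.arguments));
    messages.push({ role: 'tool', tool_call_id: toolCall.id, content: JSON.stringify(result) });
  }

  const second = await openai.chat.completions.create({ model: 'gpt-4o', messages, tools: openaiTools });
  return second.choices[0].message.content;
}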
Application-Mediated Integration
In this pattern, your application controls when tools are used:
- Application determines a tool is needed based on business logic
- Application calls the appropriate tool via ToolRouter API
- Result is used by the application, which may include presenting it to an AI
This approach works well for more structured workflows where tool usage is predetermined by application logic.
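For example, an order-support workflow might decide in ordinary code when a lookup is needed and only then call the tool. The tool name and fields below are hypothetical placeholders:

// Sketch: application-mediated tool use driven by business logic.
// 'shopify_get_order' and its input shape are hypothetical placeholders.
async function lookupOrderIfNeeded(clientId: string, token: string, orderId?: string) {
  if (!orderId) {
    return null; // business rule: nothing to look up
  }
  const order = await callTool(clientId, token, 'shopify_get_order', { order_id: orderId });
  // The result can feed a dashboard, an email, or a later prompt to an LLM
  return order;
}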
Hybrid Approach
Many sophisticated applications use a hybrid approach:
- Application defines which tools are available in different contexts (see the sketch after this list)
- AI agent can choose from this curated set of tools
- Application maintains control over security and resource usage
- AI maintains flexibility in how it accomplishes tasks
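One way to implement that curation is to filter the stack’s tool list against a per-context allowlist before the model ever sees it. A sketch, with hypothetical context and tool names:

// Sketch: exposing only a curated subset of the stack's tools to the AI agent.
// The context names and tool names are hypothetical placeholders.
const ALLOWED_TOOLS_BY_CONTEXT: Record<string, string[]> = {
  support: ['shopify_get_order', 'gmail_send_email'],
  research: ['exa_search', 'perplexity_ask'],
};

async function toolsForContext(clientId: string, token: string, context: string) {
  const allTools = await listTools(clientId, token);
  const allowed = new Set(ALLOWED_TOOLS_BY_CONTEXT[context] ?? []);
  return allTools.filter((tool: any) => allowed.has(tool.name));
}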
Security Considerations
When implementing MCP over API, security becomes even more important as your tools are now accessible over the internet. ToolRouter implements several security layers:
Authentication and Authorization
- API Keys: Identify your account and control overall access
- Tokens: Provide granular access to specific stacks
Data Protection
- TLS Encryption: All API traffic is encrypted in transit
- Payload Sanitization: Input and output are validated and sanitized
- Credential Isolation: Tool credentials are securely stored and isolated
- Granular Access Control: Each server can be configured with allowed and blocked tools in its server settings
Best Practices
- Rotate your API keys regularly
- Use the principle of least privilege when configuring tool access
- Implement rate limiting in your application (see the sketch after this list)
- Monitor tool usage for unusual patterns
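For the rate-limiting recommendation, a lightweight guard in front of callTool is often enough. Here is a sketch of a simple sliding-window limiter; the threshold is an arbitrary example, not a ToolRouter limit:

// Sketch: simple client-side sliding-window rate limit around tool calls.
const MAX_CALLS_PER_MINUTE = 60; // example threshold
let callTimestamps: number[] = [];

async function rateLimitedCallTool(clientId: string, token: string, toolName: string, toolInput: unknown) {
  const now = Date.now();
  callTimestamps = callTimestamps.filter((t) => now - t < 60_000);
  if (callTimestamps.length >= MAX_CALLS_PER_MINUTE) {
    throw new Error('Local rate limit exceeded; try again shortly');
  }
  callTimestamps.push(now);
  return callTool(clientId, token, toolName, toolInput);
}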
Real-World Use Cases
With ToolRouter’s extensive library of pre-integrated tools, customers are building powerful applications across industries. Here are some real examples of how organizations are leveraging MCP over API:
AI-Powered Customer Service
A major e-commerce company implemented MCP over API to enhance their customer service chatbot:
- Tools used:
  - Shopify for order lookup and product information
  - Gmail for communication tracking and follow-ups
  - HyperBrowser for navigating the company’s knowledge base
  - Exa Search for finding relevant product information
- Integration pattern: Direct AI agent integration
- Results: 40% reduction in support ticket escalations, improved customer satisfaction
Creative Workflow Automation
A design agency built an AI assistant that helps streamline their creative processes:
- Tools used:
  - Figma for accessing and modifying design assets
  - Notion for project documentation and requirements
  - Slack for team communication
  - Trello for task management
  - Google Drive for file sharing
- Integration pattern: Hybrid approach
- Results: Designers saved 15+ hours per week on administrative tasks, allowing them to focus on creative work
Financial Analysis Application
An investment firm built a financial analysis tool that combines AI with specialized financial data tools:
- Tools used:
  - Exa Search for market research
  - Airtable for structured data management
  - Postgres for complex financial calculations
  - Google Sheets for reporting
  - Brightdata for web scraping of financial data
- Integration pattern: Application-mediated integration
- Results: Analysts could generate comprehensive reports 5x faster than before
Smart Calendar Management
An executive assistant service built an AI solution for managing busy professionals’ schedules:
- Tools used:
  - Google Calendar for scheduling and event management
  - Gmail for email communication
  - Google Maps for travel time estimation
  - Todoist for task management
  - VAPI for voice-based interactions
- Integration pattern: Direct AI agent integration
- Results: Executives saved 7+ hours weekly on scheduling and emails
Developer Productivity Platform
A software development company created an assistant that helps developers with coding and DevOps tasks:
- Tools used:
  - GitHub for code repository management
  - Railway for infrastructure deployment
  - Linear for issue tracking
  - E2B for code execution environments
  - Context7 for pulling up-to-date documentation
- Integration pattern: Hybrid approach
- Results: Reduced development cycle times by 30%, improved documentation accuracy
Content Creation Platform
A content marketing platform used MCP over API to supercharge their AI writing assistant:
- Tools used:
  - Perplexity for research
  - YouTube for video content analysis
  - Discord for community engagement
  - X/Twitter for social trend monitoring
  - HyperBrowser for content verification
- Integration pattern: Hybrid approach
- Results: 3x increase in content production with higher quality and accuracy
These examples represent just a fraction of what’s possible with ToolRouter’s extensive tool library. With over 30 tool connectors available and more being added regularly, you can build powerful AI applications tailored to your specific needs without having to develop and maintain individual integrations.
For a full list of available tools, visit the ToolRouter MCPs.
Getting Started Today
The best part? ToolRouter’s MCP over API is completely free for developers right now. We believe in the transformative potential of this technology and want to make it accessible to innovators everywhere.
To get started:
- Sign up at https://app.toolrouter.ai
- Create your first stack (collection of tools)
- Obtain your client ID and generate an authentication token
For detailed implementation examples and comprehensive documentation, visit docs.toolrouter.ai/api-reference.
What’s Next for MCP Over API
We’re continuously expanding the capabilities of MCP over API. Here’s what’s on our roadmap:
- Enhanced tool discovery: Semantic search for finding the right tools
- Workflow orchestration: Chain multiple tools together for complex tasks
- Custom tool development: Build and publish your own tools to the ecosystem
Conclusion
MCP over API represents a significant evolution in how AI applications can interact with tools and data sources. By combining the semantic richness of MCP with the universal accessibility of REST APIs, we’re enabling a new generation of powerful, scalable AI applications.
Whether you’re building an AI agent, enhancing existing applications with AI capabilities, or creating entirely new AI-powered experiences, MCP over API provides the infrastructure you need to succeed.
Ready to get started? Sign up today at app.toolrouter.ai and join the growing community of developers building the future of AI applications.