LLM Integration Guide
mcp_use supports integration with any Large Language Model (LLM) that is compatible with LangChain. This guide covers how to use different LLM providers with mcp_use and emphasizes the flexibility to use any LangChain-supported model.

Key Requirement: Your chosen LLM must support tool calling (also known as function calling) to work with MCP tools. Most modern LLMs support this feature.
Universal LLM Support
mcp_use leverages LangChain’s architecture to support any LLM that implements the LangChain interface. This means you can use virtually any model from any provider, including:

OpenAI
GPT-4, GPT-4o, GPT-3.5 Turbo
Anthropic
Claude 3.5 Sonnet, Claude 3 Opus, Claude 3 Haiku
Google
Gemini Pro, Gemini Flash, PaLM
Open Source
Llama, Mistral, CodeLlama via various providers
Popular Provider Examples
OpenAI
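For example, via the `langchain-openai` integration (the model name is illustrative; this reads an `OPENAI_API_KEY` environment variable):

```python
from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient

# Reads OPENAI_API_KEY from the environment
llm = ChatOpenAI(model="gpt-4o", temperature=0)

client = MCPClient.from_config_file("mcp_config.json")
agent = MCPAgent(llm=llm, client=client)
```

`agent.run(...)` is a coroutine, so call it from inside an asyncio event loop.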
Anthropic Claude
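Via the `langchain-anthropic` integration (model name illustrative; requires an `ANTHROPIC_API_KEY` environment variable):

```python
from langchain_anthropic import ChatAnthropic

# Reads ANTHROPIC_API_KEY from the environment
llm = ChatAnthropic(model="claude-3-5-sonnet-20240620")
```

Pass `llm` to `MCPAgent` exactly as with any other provider.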
Google Gemini
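Via the `langchain-google-genai` integration (model name illustrative; requires a `GOOGLE_API_KEY` environment variable):

```python
from langchain_google_genai import ChatGoogleGenerativeAI

# Reads GOOGLE_API_KEY from the environment
llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")
```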
Groq (Fast Inference)
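Via the `langchain-groq` integration (model name illustrative; requires a `GROQ_API_KEY` environment variable):

```python
from langchain_groq import ChatGroq

# Reads GROQ_API_KEY from the environment
llm = ChatGroq(model="llama-3.1-70b-versatile")
```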
Local Models with Ollama
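Via the `langchain-ollama` integration, which talks to a locally running Ollama server (model name illustrative; no API key needed):

```python
from langchain_ollama import ChatOllama

# Connects to a local Ollama server (default http://localhost:11434)
llm = ChatOllama(model="llama3.1")
```

Note that many small local models have weak or no tool-calling support; check the model card before using it with MCP tools.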
Model Requirements
Tool Calling Support
For MCP tools to work properly, your chosen model must support tool calling. Most modern LLMs support this:

✅ Supported Models:
- OpenAI: GPT-4, GPT-4o, GPT-3.5 Turbo
- Anthropic: Claude 3+ series
- Google: Gemini Pro, Gemini Flash
- Groq: Llama 3.1, Mixtral models
- Most recent open-source models
❌ Not Supported:
- Basic completion models without tool calling
- Very old model versions
- Models without function calling capabilities
Checking Tool Support
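One lightweight check, sketched here as a hypothetical helper (not part of the mcp_use API):

```python
def supports_tool_calling(llm) -> bool:
    """Heuristic: LangChain chat models that support tool calling
    implement a usable bind_tools method."""
    return callable(getattr(llm, "bind_tools", None))
```

This is only a heuristic: LangChain's base chat model class also defines `bind_tools` but raises `NotImplementedError`, so a stricter check would call it with an empty tool list and catch that exception.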
You can verify whether a model supports tools by checking for the bind_tools method that LangChain chat models implement.

Model Configuration Tips
Temperature Settings
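The presets below are illustrative defaults in a hypothetical helper, not mcp_use recommendations:

```python
# Illustrative temperature presets per task type (hypothetical helper)
TASK_TEMPERATURES = {
    "tool_calling": 0.0,      # deterministic tool arguments
    "data_extraction": 0.1,   # near-deterministic structured output
    "general_qa": 0.3,
    "creative_writing": 0.8,  # more varied phrasing
}

def pick_temperature(task: str, default: float = 0.3) -> float:
    """Return a reasonable temperature for a task type."""
    return TASK_TEMPERATURES.get(task, default)
```

Pass the result to the model constructor, e.g. `ChatOpenAI(model="gpt-4o", temperature=pick_temperature("tool_calling"))`.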
Different tasks benefit from different temperature settings: low values (near 0) make tool-call arguments more deterministic, while higher values suit open-ended generation.

Model-Specific Parameters
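As a sketch, the keyword arguments below follow each provider's LangChain chat class; the specific values and model names are illustrative:

```python
# Illustrative per-provider keyword arguments for the LangChain chat classes
PROVIDER_PARAMS = {
    "openai": {"model": "gpt-4o", "temperature": 0.2, "max_tokens": 1024},
    "anthropic": {"model": "claude-3-5-sonnet-20240620", "max_tokens": 1024},
    "groq": {"model": "llama-3.1-70b-versatile", "temperature": 0.2},
    "ollama": {"model": "llama3.1", "num_ctx": 8192},  # num_ctx: Ollama context size
}
```

Unpack the relevant entry into the constructor, e.g. `ChatOpenAI(**PROVIDER_PARAMS["openai"])`.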
Each provider has unique parameters you can configure.

Cost Optimization
Choosing Cost-Effective Models
Consider your use case when selecting models:

| Use Case | Recommended Models | Reason |
|---|---|---|
| Development/Testing | GPT-3.5 Turbo, Claude Haiku | Lower cost, good performance |
| Production/Complex | GPT-4o, Claude Sonnet | Best performance |
| High Volume | Groq models | Fast inference, competitive pricing |
| Privacy/Local | Ollama models | No API costs, data stays local |
Token Management
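mcp_use drives the conversation loop for you, but if you manage history yourself, a simple budget-based trim can keep requests inside the model's context window. The helper below is a hypothetical sketch that uses character count as a crude stand-in for tokens (roughly 4 characters per token for English text):

```python
def trim_history(messages: list[str], max_chars: int = 8000) -> list[str]:
    """Keep the most recent messages whose combined length fits the budget."""
    kept: list[str] = []
    total = 0
    for msg in reversed(messages):  # walk newest-to-oldest
        if total + len(msg) > max_chars:
            break
        kept.append(msg)
        total += len(msg)
    return list(reversed(kept))  # restore chronological order
```

For accurate counts, use your provider's tokenizer (e.g. tiktoken for OpenAI models) instead of character length.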
Environment Setup
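A small helper (hypothetical, using only the standard library) can make missing keys fail fast with a clear error instead of a confusing provider exception later:

```python
import os

def require_env(name: str) -> str:
    """Return an environment variable's value or fail fast if it is unset."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# Usage at startup, before constructing the model:
# api_key = require_env("OPENAI_API_KEY")
```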
Always use environment variables for API keys rather than hardcoding them in source code.

Advanced Integration
Custom Model Wrappers
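A full LangChain integration subclasses BaseChatModel, but a lightweight composition wrapper is often enough. The sketch below (class and parameter names are hypothetical) adds retry behavior with exponential backoff around any chat model's `invoke`:

```python
import time

class RetryingLLM:
    """Delegates to an underlying LangChain chat model, retrying
    transient failures with exponential backoff."""

    def __init__(self, inner, retries: int = 3, base_delay: float = 1.0):
        self.inner = inner
        self.retries = retries
        self.base_delay = base_delay

    def invoke(self, *args, **kwargs):
        last_error = None
        for attempt in range(self.retries):
            try:
                return self.inner.invoke(*args, **kwargs)
            except Exception as exc:  # narrow to provider-specific errors in real code
                last_error = exc
                time.sleep(self.base_delay * 2 ** attempt)
        raise last_error

    def __getattr__(self, name):
        # Forward everything else (bind_tools, streaming, ...) to the wrapped model
        return getattr(self.inner, name)
```

Note that a duck-typed wrapper like this may not pass `isinstance` checks some frameworks perform; subclassing BaseChatModel is the safer route for production use.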
You can create custom wrappers for specialized models.

Model Switching
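One pattern is a small factory that imports each provider class lazily, so the active model can come from configuration; the function name and model strings below are illustrative:

```python
def make_llm(provider: str, model: str, **kwargs):
    """Build a LangChain chat model by provider name (lazy imports)."""
    if provider == "openai":
        from langchain_openai import ChatOpenAI
        return ChatOpenAI(model=model, **kwargs)
    if provider == "anthropic":
        from langchain_anthropic import ChatAnthropic
        return ChatAnthropic(model=model, **kwargs)
    if provider == "groq":
        from langchain_groq import ChatGroq
        return ChatGroq(model=model, **kwargs)
    raise ValueError(f"Unknown provider: {provider}")
```

For example, `MCPAgent(llm=make_llm("openai", "gpt-4o"), client=client)` selects the model at construction time; only the chosen provider's package needs to be installed.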
Switch between models dynamically.

Troubleshooting
Common Issues
- “Model doesn’t support tools”: Ensure your model supports function calling
- API key errors: Check environment variables and API key validity
- Rate limiting: Implement retry logic or use different models
- Token limits: Adjust max_tokens or use models with larger context windows