Common Issues
Solutions to frequently encountered problems with mcp_use
This guide covers the most common issues users encounter when working with mcp_use and their solutions.
Installation Issues
ImportError: No module named 'mcp_use'
Problem: Cannot import mcp_use after installation.
Solutions:
- Verify the installation in the correct environment:
- Check the Python path:
- Reinstall in the correct environment:
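The first two checks can be done from Python itself. A minimal sketch (the `pip install mcp-use` hint assumes the PyPI package name uses a hyphen; verify against the project's install docs):

```python
import importlib.util
import sys

# A common cause of this error is installing mcp_use into one virtual
# environment while running Python from another -- confirm which
# interpreter is actually executing.
print(sys.executable)

# find_spec returns None when the module cannot be resolved on sys.path.
spec = importlib.util.find_spec("mcp_use")
if spec is None:
    print("mcp_use is not importable from this interpreter; "
          "try: pip install mcp-use")
else:
    print(f"mcp_use found at {spec.origin}")
```

If the printed interpreter path is not inside your virtual environment, activate the environment (or use `python -m pip install`) before reinstalling.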
LangChain Provider Not Found
Problem: Error importing LangChain providers such as langchain_openai.
Solution: Install the specific LangChain provider:
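LangChain providers live in separate pip packages, and the module name (underscores) differs from the package name (hyphens). A small diagnostic sketch; the module-to-package mapping below follows the LangChain partner-package naming convention, but verify the exact package name for your provider:

```python
import importlib.util

# Module name -> pip package name (hedged: confirm against your provider's docs).
PROVIDER_PACKAGES = {
    "langchain_openai": "langchain-openai",
    "langchain_anthropic": "langchain-anthropic",
    "langchain_groq": "langchain-groq",
}

def missing_providers(modules):
    """Return pip install hints for provider modules that are not importable."""
    hints = []
    for mod in modules:
        if importlib.util.find_spec(mod) is None:
            hints.append(f"pip install {PROVIDER_PACKAGES.get(mod, mod)}")
    return hints

print(missing_providers(["langchain_openai"]))
```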
Configuration Issues
API Key Not Found
Problem: APIKeyNotFoundError or similar authentication errors.
Solutions:
- Check environment variables:
- Verify the .env file location and contents:
- Ensure load_dotenv() is called:
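All three checks can be combined in a few lines. This sketch assumes your provider key is named OPENAI_API_KEY (adjust for your provider) and that you use python-dotenv for .env files; by default, load_dotenv() searches the current working directory, so run your script from the directory containing .env:

```python
import os

REQUIRED_KEYS = ["OPENAI_API_KEY"]  # adjust for your provider

# Load a .env file if python-dotenv is installed; this must happen
# before the agent or LLM client reads the environment.
try:
    from dotenv import load_dotenv
    load_dotenv()
except ImportError:
    pass  # python-dotenv not installed; rely on the shell environment

def check_keys(keys):
    """Return the names of any required keys missing or empty in the environment."""
    return [k for k in keys if not os.environ.get(k)]

missing = check_keys(REQUIRED_KEYS)
if missing:
    print(f"Missing environment variables: {missing}")
```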
Invalid Configuration File
Problem: JSON parsing errors when loading configuration.
Solutions:
- Validate the JSON syntax:
- Check the file encoding (it should be UTF-8):
- Verify all required fields are present:
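A small validator covers all three checks at once. The field names here (a top-level "mcpServers" object whose entries each have a "command") follow the common MCP client configuration shape; check them against the mcp_use configuration docs for your version:

```python
import json

def validate_config(path):
    """Parse an MCP config file and check for the fields mcp_use expects."""
    # Decoding explicitly as UTF-8 surfaces encoding problems immediately.
    with open(path, encoding="utf-8") as f:
        config = json.load(f)  # raises json.JSONDecodeError on bad syntax
    servers = config.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        raise ValueError("config must contain a non-empty 'mcpServers' object")
    for name, server in servers.items():
        if "command" not in server:
            raise ValueError(f"server '{name}' is missing 'command'")
    return config
```

json.JSONDecodeError messages include the line and column of the syntax error, which is usually enough to spot a trailing comma or missing quote.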
MCP Server Issues
Server Not Found
Problem: FileNotFoundError when trying to start an MCP server.
Solutions:
- Check if the server is installed:
- Test the server manually:
- Use the full path in the configuration:
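The "command" in your configuration must resolve on the PATH visible to the process that spawns the server. shutil.which performs the same lookup the OS would, so it tells you both whether the command exists and which full path to put in the configuration:

```python
import shutil

def resolve_command(command):
    """Report whether a server command resolves on PATH, and where."""
    path = shutil.which(command)
    if path is None:
        print(f"'{command}' not found on PATH; install it or configure its full path")
    else:
        print(f"'{command}' resolves to {path}")
    return path

# Node-based MCP servers are usually launched via npx, so check it first.
resolve_command("npx")
```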
Server Connection Timeout
Problem: Server takes too long to start or respond.
Solutions:
- Increase the timeout in the agent configuration:
- Check the server logs for issues:
- Test the server independently:
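To test a server independently of the agent, start it as a plain subprocess and see whether it crashes immediately, hangs, or stays alive waiting for input (the healthy behavior for a stdio server). A sketch, with a placeholder command; substitute your server's actual command and args:

```python
import subprocess
import sys

def probe_server(cmd, timeout=10.0):
    """Start a server command and report what it does within `timeout` seconds.

    A healthy stdio MCP server keeps running and waits for input; exiting
    immediately (bad install, bad args) shows up here with its stderr.
    """
    proc = subprocess.Popen(
        cmd, stdin=subprocess.PIPE,
        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
    )
    try:
        _, err = proc.communicate(timeout=timeout)
        return f"exited with code {proc.returncode}: {err.decode(errors='replace')}"
    except subprocess.TimeoutExpired:
        proc.kill()
        proc.wait()
        return "still running after timeout (likely healthy)"

# Placeholder: probe a short-lived command in place of a real server.
print(probe_server([sys.executable, "-c", "print('ok')"], timeout=5.0))
```

If the probe reports a quick non-zero exit, the captured stderr usually contains the real error the agent was hiding behind a timeout.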
Permission Denied
Problem: Server cannot access files or directories.
Solutions:
- Check file permissions:
- Update the server configuration with accessible paths:
- Run with appropriate user permissions:
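A quick way to check what the current user can actually do with a path, from the same account that runs the agent:

```python
import os

def check_access(path):
    """Report which permissions the current user has on a path."""
    return {
        "exists": os.path.exists(path),
        "readable": os.access(path, os.R_OK),
        "writable": os.access(path, os.W_OK),
        "executable": os.access(path, os.X_OK),
    }

# Example: a filesystem server can only serve directories it can read
# (and traverse, which on POSIX systems requires the executable bit).
print(check_access("/tmp"))
```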
Agent Runtime Issues
No Tools Available
Problem: Agent reports no tools are available.
Solutions:
- Verify the server connection:
- Check for server startup errors:
- Verify server compatibility:
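To verify compatibility independently of mcp_use, you can speak the MCP protocol to the server directly: an initialize request followed by tools/list. The method names follow the MCP specification; the protocol version string below is one published revision ("2024-11-05"), so match whatever your server supports. A sketch that just builds the messages:

```python
import json

def initialize_request(request_id=1):
    """MCP initialize handshake message (first message a client sends)."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "debug-client", "version": "0.0.1"},
        },
    }

def list_tools_request(request_id=2):
    """Asks the server to enumerate its tools."""
    return {"jsonrpc": "2.0", "id": request_id, "method": "tools/list", "params": {}}

# For the stdio transport these are written to the server's stdin,
# one JSON message per line.
for msg in (initialize_request(), list_tools_request()):
    print(json.dumps(msg))
```

If the server's tools/list response is empty or malformed, the problem is the server, not the agent.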
Tool Execution Failures
Problem: Tools fail during execution with unclear errors.
Solutions:
- Enable verbose logging:
- Test tools individually:
- Check the tool arguments:
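Verbose logging is usually the fastest route to the real error. mcp_use and the underlying client libraries emit diagnostics through Python's standard logging module; the logger names below are assumptions, so check the prefixes that actually appear in your output:

```python
import logging

# Send everything at DEBUG and above to stderr with timestamps.
logging.basicConfig(
    level=logging.DEBUG,
    format="%(asctime)s %(name)s %(levelname)s %(message)s",
)

# Narrow the firehose: keep the library you are debugging at DEBUG
# while quieting noisy transports.
logging.getLogger("mcp_use").setLevel(logging.DEBUG)
logging.getLogger("httpx").setLevel(logging.WARNING)
```

With this in place, failed tool calls typically log the exact arguments sent and the server's raw error response.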
Memory/Performance Issues
Problem: Agent uses too much memory or runs slowly.
Solutions:
- Enable the server manager:
- Limit concurrent servers:
- Restrict the available tools:
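The simplest lever is often the configuration itself: fewer servers means fewer subprocesses and fewer tool descriptions consuming the LLM's context window. A trimmed-config sketch (the filesystem server package shown is the standard @modelcontextprotocol one; the agent keyword arguments in the comment are taken from mcp_use's documented options, so verify them against your installed version):

```python
# Keep only the servers the task actually needs.
config = {
    "mcpServers": {
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
        },
    }
}

# mcp_use's agent also exposes knobs for this, e.g. (names per its docs):
#   MCPAgent(llm=llm, client=client,
#            use_server_manager=True,          # start servers lazily, on demand
#            disallowed_tools=["screenshot"])  # drop tools you never want used
```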
LLM-Specific Issues
Model Not Supporting Tools
Problem: LLM doesn’t support function calling.
Solution: Use a tool-calling capable model:
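An illustrative guard, useful before wiring a model into an agent. The model names below are examples of models generally known to support tool calling at the time of writing; maintain your own allow-list for the providers you use:

```python
# Hypothetical allow-list -- keep this in sync with your providers' docs.
TOOL_CALLING_MODELS = {
    "gpt-4o",
    "gpt-4o-mini",
    "claude-3-5-sonnet-latest",
}

def supports_tools(model_name):
    """Cheap pre-flight check before constructing an agent with this model."""
    return model_name in TOOL_CALLING_MODELS

print(supports_tools("gpt-4o"))
```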
Rate Limiting
Problem: API rate limits being exceeded.
Solutions:
- Add delays between requests:
- Use a different model tier:
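Rather than fixed delays, the usual pattern is exponential backoff with jitter around the call that hits the provider. A generic sketch; catch your provider SDK's actual rate-limit exception (e.g. an openai RateLimitError) instead of the broad Exception used here for illustration:

```python
import random
import time

def with_backoff(fn, retries=5, base_delay=1.0, max_delay=30.0):
    """Call `fn`, retrying with exponential backoff and jitter on failure."""
    for attempt in range(retries):
        try:
            return fn()
        except Exception:  # narrow this to your provider's rate-limit error
            if attempt == retries - 1:
                raise  # out of retries; surface the error
            delay = min(max_delay, base_delay * 2 ** attempt)
            delay += random.uniform(0, delay / 2)  # jitter avoids thundering herds
            time.sleep(delay)
```

Usage: `with_backoff(lambda: agent.run(query))` wraps a single agent call; retries double the wait each time (1s, 2s, 4s, ...) up to max_delay.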
Model Compatibility Issues
Problem: Some models don’t support tools or don’t work well with MCP servers.
Solution: Many models either don’t support tool calling or have poor compatibility with MCP servers. If you encounter a model that doesn’t behave well, please open a pull request with proof of the issue and add it to the list of incompatible models below.
Known Incompatible Models:
- List will be updated as issues are reported
When reporting model compatibility issues, please include:
- Model name and version
- Specific error messages
- Test case demonstrating the issue
- Expected vs actual behavior
Environment-Specific Issues
Docker/Container Issues
Problem: MCP servers not working in containerized environments.
Solutions:
- Install Node.js in the container:
- Mount the necessary directories:
- Set the proper environment variables:
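A sketch of a Dockerfile covering these points, assuming a Debian-based Python image (package names differ on Alpine), that npx-based MCP servers are in use, and that the pip package is named mcp-use with a main.py entry point (both hypothetical here, so adjust for your project):

```dockerfile
FROM python:3.12-slim

# Python base images do not include Node.js, which npx-based MCP servers need.
RUN apt-get update && apt-get install -y --no-install-recommends nodejs npm \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
COPY . .
RUN pip install --no-cache-dir mcp-use

# Pass secrets and data at runtime rather than baking them into the image:
#   docker run -e OPENAI_API_KEY -v "$PWD/data:/app/data" my-agent
CMD ["python", "main.py"]
```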
Windows-Specific Issues
Problem: Path or command issues on Windows.
Solutions:
- Use Windows-style paths:
- Use cmd for Node.js commands:
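On Windows, npx is a batch script rather than an executable, so it often must be launched through cmd /c; note also that backslashes in JSON paths must be escaped. A configuration sketch (the directory path is a placeholder):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "cmd",
      "args": [
        "/c", "npx", "-y",
        "@modelcontextprotocol/server-filesystem",
        "C:\\Users\\me\\projects"
      ]
    }
  }
}
```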
Getting Help
If you continue experiencing issues:
- Check logs: Enable debug logging and review error messages
- Search issues: Look through GitHub issues
- Create issue: Report bugs with:
- Complete error messages
- Configuration files (remove API keys)
- Environment details (OS, Python version, etc.)
- Steps to reproduce
Most issues are related to configuration, environment setup, or missing dependencies. Double-check these basics before diving into complex debugging.