Overview
MCP-use provides optional observability integration to help you debug, monitor, and optimize your AI agents. Observability gives you visibility into:
- Agent execution flow with detailed step-by-step tracing
- Tool usage patterns and performance metrics
- LLM calls with token usage and costs
- Error tracking and debugging information
- Conversation analytics across sessions
Completely Optional: Observability is entirely opt-in and requires zero code changes to your existing workflows.
Supported Platforms
MCP-use integrates with three leading observability platforms:
- Langfuse - Open-source LLM observability with self-hosting options
- Laminar - Comprehensive AI application monitoring platform
- LangSmith - LangChain’s observability platform
What Gets Traced
These platforms automatically capture:
- Agent conversations - Full query/response pairs
- LLM calls - Model usage, tokens, and costs
- Tool executions - Which MCP tools were used and their outputs
- Performance metrics - Execution times and step counts
- Error tracking - Failed operations with full context
Example Trace View
Your observability dashboard shows each agent run as a hierarchical trace: the top-level query, the LLM calls nested inside it, and each tool execution with its inputs, outputs, and timing.
Langfuse Integration
Langfuse is an open-source LLM observability platform with both cloud and self-hosted options.
Setup Langfuse
1. Install Langfuse
2. Get Your Keys
- Cloud: Sign up at cloud.langfuse.com
- Self-hosted: Follow the self-hosting guide
3. Set Environment Variables
4. Start Using
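The four steps above can be sketched as follows. This is a minimal sketch: the key values are placeholders you replace with your own, and the variable names follow the standard Langfuse client conventions.

```python
# 1. Install: pip install langfuse
import os

# 3. Set environment variables with the keys from cloud.langfuse.com
#    (or your self-hosted instance). Placeholders shown:
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
# Self-hosted only:
# os.environ["LANGFUSE_HOST"] = "https://langfuse.example.com"

# 4. Start using -- your agent code does not change. With the keys
# present, mcp-use picks up Langfuse and traces runs automatically.
```

Setting the variables in your shell or `.env` file works the same way; the point is that no agent code changes are needed.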
Langfuse Dashboard Features
- Timeline view - Step-by-step execution flow
- Performance metrics - Response times and costs
- Error analysis - Debug failed operations
- Usage analytics - Tool and model usage patterns
- Session grouping - Track conversations over time
- Self-hosting - Full control over your data
Environment Variables
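As a quick reference, the variables the Langfuse integration reads can be summarized as below (a sketch; the first three are the standard Langfuse client variables, and `MCP_USE_LANGFUSE` is mcp-use's opt-out flag mentioned in Troubleshooting):

```python
# Langfuse-related environment variables and what they do
LANGFUSE_VARS = {
    "LANGFUSE_PUBLIC_KEY": "public API key (required)",
    "LANGFUSE_SECRET_KEY": "secret API key (required)",
    "LANGFUSE_HOST": "base URL of a self-hosted instance (optional)",
    "MCP_USE_LANGFUSE": 'set to "false" to disable the integration',
}
```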
Laminar Integration
Laminar provides comprehensive AI application monitoring with advanced analytics.
Setup Laminar
1. Install Laminar
2. Get Your API Key
- Sign up at lmnr.ai
- Create a new project
- Copy your project API key
3. Set Environment Variable
4. Start Using
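The Laminar steps can be sketched the same way (a minimal sketch; the key value is a placeholder, and the variable name follows the standard Laminar SDK convention):

```python
# 1. Install: pip install lmnr
import os

# 3. Set the project API key copied from lmnr.ai (placeholder shown):
os.environ["LMNR_PROJECT_API_KEY"] = "your-laminar-project-key"

# 4. Start using -- agent code is unchanged; with the key present,
# mcp-use traces runs to Laminar automatically.
```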
Laminar Features
- Advanced tracing - Detailed execution flow visualization
- Real-time monitoring - Live performance metrics
- Cost tracking - LLM usage and billing analytics
- Error analysis - Comprehensive error tracking and debugging
- Team collaboration - Shared dashboards and insights
- Production monitoring - Built for scale
Environment Variables
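For reference, the Laminar integration involves just two variables (a sketch; `MCP_USE_LAMINAR` is the opt-out flag mentioned in Troubleshooting):

```python
# Laminar-related environment variables and what they do
LAMINAR_VARS = {
    "LMNR_PROJECT_API_KEY": "project API key (required)",
    "MCP_USE_LAMINAR": 'set to "false" to disable the integration',
}
```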
Additional Information
Privacy & Data Security
What’s Collected
- Queries and responses (for debugging context)
- Tool inputs/outputs (to understand workflows)
- Model metadata (provider, model name, tokens)
- Performance data (execution times, success rates)
What’s NOT Collected
- No personal information beyond what you send to your LLM
- No API keys or credentials
- No unauthorized data - you control what gets traced
Security Features
- HTTPS encryption for all data transmission
- Self-hosting options available (Langfuse)
- Easy to disable with environment variables
- Data ownership - you control your observability data
Disabling Observability
Temporarily Disable
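To turn tracing off for a single process without removing your keys or touching agent code, set the opt-out flags (these flag names are the ones referenced in Troubleshooting below):

```python
import os

# Disable both integrations for this process; delete or flip these
# to re-enable tracing.
os.environ["MCP_USE_LANGFUSE"] = "false"
os.environ["MCP_USE_LAMINAR"] = "false"
```

Exporting the same variables in your shell or `.env` file has the same effect.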
Troubleshooting
Common Issues
“Package not installed” errors
- Install the missing integration package (e.g. pip install langfuse or pip install lmnr)
Traces not appearing
- Verify your API keys are correct
- Check that observability isn’t disabled (MCP_USE_LANGFUSE or MCP_USE_LAMINAR set to “false”)
- Check network connectivity to the platform
- Enable debug logging: logging.basicConfig(level=logging.DEBUG)
Self-hosted Langfuse not reachable
- Point the client at your instance with the LANGFUSE_HOST environment variable
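For example (the URL below is illustrative; substitute your own instance):

```python
import os

# Base URL of your self-hosted Langfuse deployment (illustrative value)
os.environ["LANGFUSE_HOST"] = "https://langfuse.internal.example.com"
```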
LangSmith Integration
Advanced Debugging: LangChain offers LangSmith, a powerful tool for debugging agent behavior that integrates seamlessly with mcp-use.
1. Sign Up - Visit smith.langchain.com and create an account
2. Get API Keys - After login, you’ll receive environment variables to add to your .env file
3. Visualize - You’ll be able to visualize agent behavior, tool calls, and decision-making processes on their platform
LangSmith provides detailed traces of your agent’s execution, making it easier to understand complex multi-step workflows.
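The environment variables from step 2 usually look like the following (a sketch using the standard LangChain tracing variable names; the key and project name are placeholders):

```python
import os

# Standard LangChain/LangSmith tracing variables (placeholder values):
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "lsv2_..."        # from smith.langchain.com
os.environ["LANGCHAIN_PROJECT"] = "mcp-use-agent"   # optional project grouping
```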
Benefits
For Development
- Faster debugging - See exactly where workflows fail
- Performance optimization - Identify slow operations
- Cost monitoring - Track LLM usage and expenses
For Production
- Real-time monitoring - Monitor agent performance
- Error tracking - Get alerted to failures
- Usage analytics - Understand user interaction patterns
For Teams
- Shared visibility - Everyone can see agent behavior
- Knowledge sharing - Learn from successful workflows
- Collaborative debugging - Debug issues together
Getting Help
Need help with observability setup?
- Langfuse Documentation: langfuse.com/docs
- Laminar Documentation: lmnr.ai/docs
- LangSmith Documentation: smith.langchain.com
- MCP-use Issues: GitHub Issues
Pro Tip: Start with one platform to get familiar with observability, then add another if you need different features or perspectives.