Overview
MCP-use provides optional observability integration to help you debug, monitor, and optimize your AI agents. Observability gives you visibility into:
- Agent execution flow with detailed step-by-step tracing
- Tool usage patterns and performance metrics
- LLM calls with token usage and costs
- Error tracking and debugging information
- Conversation analytics across sessions
Completely Optional: Observability is entirely opt-in and requires zero code changes to your existing workflows.
Supported Platforms
MCP-use currently integrates with:
- Langfuse (v3.38.x+) - Open-source LLM observability with self-hosting options
- Requires: `langfuse@^3.38.0` and `langfuse-langchain@^3.38.0`
- ⚠️ Use the correct package names: `langfuse` and `langfuse-langchain` (NOT `@langfuse/core` or `@langfuse/langchain`)
- Auto-enabled: Tracing starts automatically in mcp-use if the required environment variables are set.
What Gets Traced
Langfuse automatically captures:
- Agent conversations - Full query/response pairs
- LLM calls - Model usage, tokens, and costs
- Tool executions - Which MCP tools were used and their outputs
- Chain executions - Step-by-step execution flow
- Performance metrics - Execution times and step counts
- Error tracking - Failed operations with full context
Example Trace View
Your observability dashboard will show each agent run as a nested trace of LLM calls and tool executions, with timing and token usage for every step.

Langfuse Integration
Langfuse is an open-source LLM observability platform with both cloud and self-hosted options.

Version Compatibility: MCP-use supports `langfuse` and `langfuse-langchain` version 3.38.x+. While these packages show a peer dependency warning with LangChain 1.0, they work correctly with mcp-use, and traces are successfully sent to Langfuse.

Setup Langfuse
1. Install Langfuse Packages
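For example, with npm:

```shell
npm install langfuse@^3.38.0 langfuse-langchain@^3.38.0
```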
Package Names: Use `langfuse` and `langfuse-langchain` (version 3.38.x+). Do NOT use `@langfuse/core` or `@langfuse/langchain` - these are incorrect package names and will not work with mcp-use.

Peer Dependency Warning: When installing, you may see a peer dependency warning about LangChain versions. This is expected and safe to ignore - the packages work correctly with LangChain 1.0 despite the warning. The Langfuse team is working on updating the peer dependencies for LangChain 1.0 compatibility.
2. Get Your Keys
- Cloud: Sign up at cloud.langfuse.com
- Self-hosted: Follow the self-hosting guide
3. Set Environment Variables
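Export your Langfuse credentials. The key names below are the standard Langfuse SDK defaults; the values shown are placeholders from your project settings:

```shell
export LANGFUSE_PUBLIC_KEY="pk-lf-..."   # from your Langfuse project settings
export LANGFUSE_SECRET_KEY="sk-lf-..."
# Optional: only needed for self-hosted or non-default regions
export LANGFUSE_HOST="https://cloud.langfuse.com"
```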
4. Start Using
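With the packages installed and the environment variables set, no observability-specific code is required. A minimal sketch of an agent run - `MCPClient.fromDict`, the `MCPAgent` options, and the OpenAI model name are assumptions here; adapt them to your own setup:

```typescript
import { MCPAgent, MCPClient } from "mcp-use";
import { ChatOpenAI } from "@langchain/openai";

// With LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY set, tracing is
// picked up automatically - nothing observability-specific below.
const client = MCPClient.fromDict({
  mcpServers: { /* your MCP server config */ },
});
const agent = new MCPAgent({
  llm: new ChatOpenAI({ model: "gpt-4o" }),
  client,
});

const result = await agent.run("What tools do you have access to?");
console.log(result);

await agent.close(); // flush any pending traces
```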
Langfuse Dashboard Features
- Timeline view - Step-by-step execution flow
- Performance metrics - Response times and costs
- Error analysis - Debug failed operations
- Usage analytics - Tool and model usage patterns
- Session grouping - Track conversations over time
- Self-hosting - Full control over your data
Environment Variables
Required
Optional
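As a consolidated reference - the key names follow the standard Langfuse SDK plus the mcp-use variables mentioned on this page; values are placeholders:

```shell
# Required
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."

# Optional
export LANGFUSE_HOST="https://cloud.langfuse.com"  # or LANGFUSE_BASEURL for self-hosted
export MCP_USE_LANGFUSE="false"                    # set to disable observability
export MCP_USE_AGENT_ENV="production"              # environment tag (env:production)
```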
Advanced Configuration
Custom Metadata and Tags
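One way to attach metadata and tags is to construct a Langfuse `CallbackHandler` yourself. The handler options below (`metadata`, `tags`, `sessionId`) are standard `langfuse-langchain` parameters; how the handler is passed to the agent is a sketch and may differ in your mcp-use version:

```typescript
import { CallbackHandler } from "langfuse-langchain";

// Standard langfuse-langchain options for organizing traces.
const langfuseHandler = new CallbackHandler({
  metadata: { team: "search", release: "2024-06" }, // example values
  tags: ["experiment-a", "mcp-agent"],
  sessionId: "session-123",
});

// Sketch: hand the handler to your agent/chain as a LangChain callback.
// const result = await agent.run(query, { callbacks: [langfuseHandler] });
```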
You can add custom metadata and tags to your traces for better organization and filtering.

Environment Tagging
MCP-use automatically adds environment tags to traces based on the `MCP_USE_AGENT_ENV` variable: `env:local`, `env:production`, etc., making it easy to filter traces by environment in your Langfuse dashboard.
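The tag itself is just a string derived from the variable. A self-contained illustration (not mcp-use internals; the `"local"` fallback when the variable is unset is an assumption for illustration):

```typescript
// MCP_USE_AGENT_ENV -> "env:<value>"; falls back to "local" when unset
// (assumed default, for illustration only).
function environmentTag(env: Record<string, string | undefined>): string {
  return `env:${env.MCP_USE_AGENT_ENV ?? "local"}`;
}

console.log(environmentTag({ MCP_USE_AGENT_ENV: "production" })); // env:production
console.log(environmentTag({}));                                  // env:local
```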
Custom Callbacks
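For instance, a handler configured with explicit credentials instead of the env-var defaults - the `CallbackHandler` options are standard `langfuse-langchain` parameters, while the way it is wired into `MCPAgent` is a sketch:

```typescript
import { CallbackHandler } from "langfuse-langchain";

// A handler with explicit credentials (instead of env-var defaults).
const customHandler = new CallbackHandler({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY,
  secretKey: process.env.LANGFUSE_SECRET_KEY,
  baseUrl: "https://cloud.langfuse.com",
});

// Sketch: the exact option name on MCPAgent may differ in your version.
// const agent = new MCPAgent({ llm, client, callbacks: [customHandler] });
```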
You can provide custom Langfuse callback handlers or other LangChain callbacks.

Disabling Observability
You can disable observability in several ways:

1. Via Environment Variable
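Set the kill-switch variable referenced in the troubleshooting section:

```shell
export MCP_USE_LANGFUSE="false"
```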
2. Via Agent Configuration
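Per-agent disabling might look like the sketch below. The `observe` option name is an assumption, not a documented flag - check the mcp-use API reference for your version:

```typescript
import { MCPAgent, MCPClient } from "mcp-use";
import { ChatOpenAI } from "@langchain/openai";

const agent = new MCPAgent({
  llm: new ChatOpenAI({ model: "gpt-4o" }),
  client: MCPClient.fromDict({ mcpServers: { /* ... */ } }),
  observe: false, // hypothetical option name: disable tracing for this agent only
});
```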
Advanced Usage
Direct ObservabilityManager Usage
For advanced use cases, you can use the `ObservabilityManager` directly.
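A heavily hedged sketch - the export location and method names below are illustrative assumptions, not a documented API:

```typescript
import { ObservabilityManager } from "mcp-use"; // assumed export path

// Hypothetical usage - method names are assumptions for illustration.
const manager = new ObservabilityManager();
const callbacks = await manager.getCallbacks(); // assumed accessor for LangChain callbacks

// ...pass `callbacks` to a LangChain runnable, then flush on shutdown:
// await manager.shutdown();
```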
Using with Custom LangChain Chains
You can use the observability manager with custom LangChain chains.

Serverless Considerations
For serverless environments (AWS Lambda, Vercel, Netlify, etc.), ensure proper shutdown to flush traces.

Basic Pattern
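The pattern is simply try/finally around the agent's work, so traces flush even on error - a sketch assuming mcp-use's `MCPAgent` and `MCPClient.fromDict`:

```typescript
import { MCPAgent, MCPClient } from "mcp-use";
import { ChatOpenAI } from "@langchain/openai";

const client = MCPClient.fromDict({ mcpServers: { /* ... */ } });
const agent = new MCPAgent({ llm: new ChatOpenAI({ model: "gpt-4o" }), client });

try {
  const result = await agent.run("your query");
  console.log(result);
} finally {
  // Always flush traces, even if run() throws.
  await agent.close();
}
```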
AWS Lambda Example
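The same pattern inside a Lambda handler - event shape, model, and mcp-use call signatures are assumptions to adapt:

```typescript
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";
import { MCPAgent, MCPClient } from "mcp-use";
import { ChatOpenAI } from "@langchain/openai";

export const handler = async (
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  const client = MCPClient.fromDict({ mcpServers: { /* ... */ } });
  const agent = new MCPAgent({ llm: new ChatOpenAI({ model: "gpt-4o" }), client });

  try {
    const { query } = JSON.parse(event.body ?? "{}");
    const result = await agent.run(query);
    return { statusCode: 200, body: JSON.stringify({ result }) };
  } finally {
    // Flush traces before the Lambda execution environment freezes.
    await agent.close();
  }
};
```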
Critical for Serverless: Always call `agent.close()` in a finally block to ensure traces are flushed before the serverless function terminates. Otherwise, traces may be lost.

Debugging
Enable Debug Logging
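A one-liner using the `Logger.setDebug` helper referenced in the troubleshooting notes, assuming it is exported from the package root:

```typescript
import { Logger } from "mcp-use";

Logger.setDebug(true); // verbose logs, including observability events
```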
Enable debug logging to see observability events.

Verify Langfuse Setup
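For example, a minimal standalone script using the `langfuse` SDK directly. It reads the same environment variables, and `flushAsync` ensures the event is sent before the process exits:

```typescript
import { Langfuse } from "langfuse";

// Uses LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY (and LANGFUSE_BASEURL
// for self-hosted instances) from the environment.
const langfuse = new Langfuse();

langfuse.trace({
  name: "setup-verification",
  metadata: { source: "mcp-use-setup-check" },
});

// Flush pending events, then look for the trace in your dashboard.
await langfuse.flushAsync();
console.log("Trace sent - check your Langfuse project for 'setup-verification'.");
```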
Create a simple test script to verify your Langfuse setup.

Troubleshooting
Common Issues
“Package not installed” errors
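First check what is actually installed:

```shell
npm ls langfuse langfuse-langchain
# If missing or below 3.38.x, (re)install:
npm install langfuse@^3.38.0 langfuse-langchain@^3.38.0
```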
Make sure you have the correct Langfuse packages installed (version 3.38.x or higher).

Common Mistake: Do NOT install `@langfuse/core` or `@langfuse/langchain`. These are incorrect package names and will not work with mcp-use. The correct packages are `langfuse` and `langfuse-langchain` (without the @ scope).

“API keys not found” warnings
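Confirm both keys are visible to the Node process that runs your agent, for example:

```shell
node -e "console.log('public:', !!process.env.LANGFUSE_PUBLIC_KEY, 'secret:', !!process.env.LANGFUSE_SECRET_KEY)"
```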
No traces appearing in dashboard
- Verify API keys are correct: Check your Langfuse project settings
- Check observability isn’t disabled: Ensure `MCP_USE_LANGFUSE` is not set to `"false"`
- Verify network connectivity: Make sure your application can reach Langfuse servers
- Enable debug logging: Use `Logger.setDebug(true)` to see detailed logs
- Ensure proper shutdown: Call `await agent.close()` to flush traces
Traces not appearing in serverless environments
Self-hosted Langfuse connection issues
For self-hosted Langfuse instances, set the `LANGFUSE_HOST` or `LANGFUSE_BASEURL` environment variable.
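For example (the URL is a placeholder for your own instance):

```shell
export LANGFUSE_HOST="https://langfuse.your-domain.example"
```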
Privacy & Data Security
What’s Collected
- Queries and responses (for debugging context)
- Tool inputs/outputs (to understand workflows)
- Model metadata (provider, model name, tokens)
- Performance data (execution times, success rates)
- Custom metadata and tags (what you explicitly set)
What’s NOT Collected
- No additional personal information beyond what you send to your LLM
- No API keys or credentials
- No unauthorized data - you control what gets traced
Security Features
- HTTPS encryption for all data transmission (cloud instances)
- Self-hosting options available for full data control
- Easy to disable with environment variables
- Data ownership - you control your observability data
- Granular control - disable per-agent or globally
Benefits
For Development
- Faster debugging - See exactly where workflows fail
- Performance optimization - Identify slow operations
- Cost monitoring - Track LLM usage and expenses
- Rapid iteration - Understand agent behavior quickly
For Production
- Real-time monitoring - Monitor agent performance in production
- Error tracking - Get alerted to failures
- Usage analytics - Understand user interaction patterns
- Cost management - Track and optimize LLM costs
For Teams
- Shared visibility - Everyone can see agent behavior
- Knowledge sharing - Learn from successful workflows
- Collaborative debugging - Debug issues together
- Best practices - Identify and share effective patterns
Getting Help
Need help with observability setup?
- Langfuse Documentation: langfuse.com/docs
- MCP-use Documentation: docs.mcp-use.com
- GitHub Issues: github.com/mcp-use/mcp-use/issues
- Example Code: See examples/typescript/client/observability.ts
Pro Tip: Start with basic tracing first to understand your agent’s behavior, then add custom metadata and tags for more sophisticated analysis and filtering in your dashboard.