This guide will get you started with mcp-use in under 5 minutes. We’ll cover installation, basic configuration, and running your first agent.
## Overview
| Section | Description |
|---|---|
| MCP Server | Set up MCP servers standalone or in an existing Express server |
| MCP Agent | Install and initialize the MCP Agent component |
## MCP Server
The fastest way to scaffold a new MCP server is to use the `create-mcp-use-app` command.
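For example, with npx (the project name here is just a placeholder):

```bash
npx create-mcp-use-app my-mcp-server
```

This gives you: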
- A complete TypeScript project structure
- Pre-configured build tools and dev server
- Example tools and resources to get you started
- All necessary dependencies installed
To explore available templates and options, run:
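The exact flags may vary by version; a `--help` flag is a reasonable first guess (assumed here, not confirmed):

```bash
npx create-mcp-use-app --help
```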
### Project Structure
After creation, your project will have this structure:
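The exact layout depends on the template you pick; a typical result looks roughly like this (illustrative sketch, based on the `src/tools/` directory referenced below):

```
my-mcp-server/
├── src/
│   ├── tools/        # example tools
│   ├── resources/    # example resources
│   └── index.ts      # server entry point
├── package.json
└── tsconfig.json
```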
### Running Your Server

Start the development server:
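Assuming the scaffolded project's default scripts:

```bash
npm run dev
```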
### Next Steps

- **Add Tools**: Create new tools in the `src/tools/` directory
- **Configure Resources**: Set up resources that your server exposes
- **Deploy**: Follow deployment guides for your hosting platform
## MCP Agent
### Installing LangChain Providers
mcp-use works with various LLM providers through LangChain. You’ll need to install the appropriate LangChain provider package for your chosen LLM:
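For example (install only the package for the provider you plan to use):

```bash
# OpenAI
npm install @langchain/openai

# Anthropic
npm install @langchain/anthropic

# Google Gemini
npm install @langchain/google-genai
```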
> **Tool Calling Required**: Only models with tool calling capabilities can be used with mcp-use. Make sure your chosen model supports function calling or tool use.

For other providers, check the LangChain chat models documentation.
### Environment Setup
Set up your environment variables in a `.env` file for secure API key management:
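For example, if you're using OpenAI:

```bash
OPENAI_API_KEY=your-api-key-here
```

Remember to keep `.env` out of version control.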
### Your First Agent
Here’s a simple example to get you started:
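The sketch below assumes the TypeScript `MCPClient`/`MCPAgent` API follows the documented client–agent pattern; the Playwright server, model name, and option names are illustrative, so adjust them to your setup:

```typescript
import { ChatOpenAI } from '@langchain/openai'
import { MCPAgent, MCPClient } from 'mcp-use'
import 'dotenv/config' // loads OPENAI_API_KEY from .env

async function main() {
  // Describe the MCP servers the agent is allowed to use
  const config = {
    mcpServers: {
      browser: {
        command: 'npx',
        args: ['@playwright/mcp@latest'],
      },
    },
  }

  // Create the client from the inline config and pick a tool-calling LLM
  const client = MCPClient.fromDict(config)
  const llm = new ChatOpenAI({ model: 'gpt-4o' })

  // Wire the LLM and the MCP client into an agent, then run a query
  const agent = new MCPAgent({ llm, client, maxSteps: 30 })
  const result = await agent.run('Find the best restaurant in San Francisco')
  console.log(result)
}

main().catch(console.error)
```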
### Configuration Options

You can also load the server configuration from a config file (e.g. `browser_mcp.json`):
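For instance, a `browser_mcp.json` with a single server entry, loaded through a config-file helper (the loader method name is an assumption; see the Configuration Overview for the exact API):

```json
{
  "mcpServers": {
    "browser": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}
```

```typescript
import { MCPClient } from 'mcp-use'

// Load server definitions from the JSON file instead of passing them inline
const client = MCPClient.fromConfigFile('browser_mcp.json')
```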
For multi-server setups, tool restrictions, and advanced configuration options, see the Configuration Overview.
### Available MCP Servers
mcp-use supports any MCP server. Check out the Awesome MCP Servers list for available options.
### Streaming Agent Output

Stream agent responses as they’re generated:
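A minimal sketch, assuming the agent exposes an async-iterable `stream()` method (the method name and chunk shape may differ in your mcp-use version):

```typescript
// Reusing the `agent` from the previous example
for await (const chunk of agent.stream('Summarize the latest changes to the MCP spec')) {
  process.stdout.write(String(chunk))
}
```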
Uses LangChain’s streaming API. See the streaming documentation for more details.

## Next Steps
- **Configuration**: Complete configuration guide covering client setup and agent customization
- **LLM Integration**: Discover all supported LLM providers and optimization tips
- **Examples**: Explore real-world examples and use cases
Need Help? Join our community discussions on GitHub or check out the comprehensive examples in our repository!