This guide will get you started with mcp-use in under 5 minutes. We’ll cover installation, basic configuration, and running your first agent.
## Overview

| Section | Description |
|---|---|
| MCP Server | Set up MCP servers standalone or in an existing Express server |
| MCP Agent | Install and initialize the MCP Agent component |
## MCP Server

The fastest way to scaffold a new MCP server is to use the `create-mcp-use-app` command:
```bash
npx create-mcp-use-app@latest my-mcp-server
cd my-mcp-server
npm run dev
```
This command will create a new MCP server with:
- A complete TypeScript project structure
- Pre-configured build tools and dev server
- Example tools and resources to get you started
- All necessary dependencies installed
To explore available templates and options, run:

```bash
npx create-mcp-use-app --help
```
### Project Structure
After creation, your project will have this structure:
```
my-mcp-server/
├── src/
│   ├── index.ts       # Main server entry point
│   └── tools/         # Your custom tools
├── package.json
├── tsconfig.json
└── README.md
```
### Running Your Server
Start the development server with `npm run dev`.
Your MCP server will be available and ready to accept connections from MCP clients.
### Next Steps
- Add Tools: Create new tools in the `src/tools/` directory
- Configure Resources: Set up resources that your server exposes
- Deploy: Follow deployment guides for your hosting platform
## MCP Agent
### Installing LangChain Providers
mcp-use works with various LLM providers through LangChain. You’ll need to install the appropriate LangChain provider package for your chosen LLM:
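For example, the commonly used LangChain JS provider packages can be installed with npm (install only the ones you need):

```bash
# Install the LangChain provider package(s) for your chosen LLM
npm install @langchain/openai        # OpenAI (GPT-4o, etc.)
npm install @langchain/anthropic     # Anthropic (Claude)
npm install @langchain/groq          # Groq
npm install @langchain/google-genai  # Google Gemini
```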
**Tool Calling Required:** Only models with tool calling capabilities can be used with mcp-use. Make sure your chosen model supports function calling or tool use.
### Environment Setup
Set up your environment variables in a .env file for secure API key management:
```bash
# LLM Provider Keys (set the ones you want to use)
OPENAI_API_KEY=your_api_key_here
ANTHROPIC_API_KEY=your_api_key_here
GROQ_API_KEY=your_api_key_here
GOOGLE_API_KEY=your_api_key_here
```
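To sanity-check your environment before creating an agent, a small helper (hypothetical, not part of mcp-use) can report which provider keys are set. Once `dotenv` has loaded your `.env`, you would pass it `process.env`:

```typescript
// Hypothetical helper (not part of mcp-use): report which LLM
// providers have an API key set in the given environment.
const PROVIDER_KEYS: Record<string, string> = {
  openai: 'OPENAI_API_KEY',
  anthropic: 'ANTHROPIC_API_KEY',
  groq: 'GROQ_API_KEY',
  google: 'GOOGLE_API_KEY',
}

function availableProviders(env: Record<string, string | undefined>): string[] {
  return Object.entries(PROVIDER_KEYS)
    .filter(([, keyName]) => Boolean(env[keyName]))
    .map(([provider]) => provider)
}

// With only OPENAI_API_KEY set:
console.log(availableProviders({ OPENAI_API_KEY: 'sk-test' })) // → [ 'openai' ]
```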
### Your First Agent
Here’s a simple example to get you started:
```typescript
import { ChatOpenAI } from '@langchain/openai'
import { config } from 'dotenv'
import { MCPAgent, MCPClient } from 'mcp-use'

async function main() {
  // Load environment variables
  config()

  // Create configuration object
  const configuration = {
    mcpServers: {
      playwright: {
        command: 'npx',
        args: ['@playwright/mcp@latest'],
        env: {
          DISPLAY: ':1'
        }
      }
    }
  }

  // Create MCPClient from configuration object
  const client = new MCPClient(configuration)

  // Create LLM
  const llm = new ChatOpenAI({ model: 'gpt-4o' })

  // Create agent with the client
  const agent = new MCPAgent({
    llm,
    client,
    maxSteps: 30
  })

  // Run the query
  const result = await agent.run(
    'Find the best restaurant in San Francisco USING GOOGLE SEARCH'
  )
  console.log(`\nResult: ${result}`)

  // Clean up
  await client.closeAllSessions()
}

main().catch(console.error)
```
### Configuration Options
You can also load servers configuration from a config file:
```typescript
import { loadConfigFile, MCPClient } from 'mcp-use'

const config = await loadConfigFile('browser_mcp.json')
const client = new MCPClient(config)
```
Example configuration file (`browser_mcp.json`):
```json
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"],
      "env": {
        "DISPLAY": ":1"
      }
    }
  }
}
```
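Since the config file is plain JSON, a minimal shape-check can catch malformed files early. The sketch below is hypothetical (not part of the mcp-use API) and only validates the structure shown above:

```typescript
// Hypothetical sketch: minimal shape-check for an mcp-use config
// object before handing it to MCPClient. Not part of the mcp-use API.
interface ServerEntry {
  command: string
  args?: string[]
  env?: Record<string, string>
}

function validateConfig(raw: unknown): Record<string, ServerEntry> {
  const obj = raw as { mcpServers?: Record<string, ServerEntry> }
  if (!obj || typeof obj !== 'object' || !obj.mcpServers) {
    throw new Error('config must contain an "mcpServers" object')
  }
  for (const [name, entry] of Object.entries(obj.mcpServers)) {
    if (typeof entry.command !== 'string') {
      throw new Error(`server "${name}" is missing a "command" string`)
    }
  }
  return obj.mcpServers
}

const servers = validateConfig(JSON.parse(`{
  "mcpServers": {
    "playwright": { "command": "npx", "args": ["@playwright/mcp@latest"] }
  }
}`))
console.log(Object.keys(servers)) // → [ 'playwright' ]
```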
For multi-server setups, tool restrictions, and advanced configuration options, see the Configuration Overview.
### Available MCP Servers
mcp-use supports any MCP server. Check out the Awesome MCP Servers list for available options.
### Streaming Agent Output
Stream agent responses as they’re generated:
```typescript
// Stream intermediate steps
for await (const step of agent.stream('your query here')) {
  console.log(`Tool: ${step.action.tool}`)
  console.log(`Result: ${step.observation}`)
}

// Or stream token-level events
for await (const event of agent.streamEvents('your query here')) {
  if (event.event === 'on_chat_model_stream') {
    process.stdout.write(event.data?.chunk?.text || '')
  }
}
```
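The `for await` loop above is ordinary async iteration, so the consumption pattern can be tried without a live agent. This self-contained sketch substitutes a mock async generator for `agent.stream()`; the `{ action, observation }` step shape and the tool names are assumptions modeled on the example above:

```typescript
// Mock of the step objects yielded by agent.stream() (shape assumed
// from the example above; tool names are made up for illustration).
interface AgentStep {
  action: { tool: string }
  observation: string
}

async function* mockStream(): AsyncGenerator<AgentStep> {
  yield { action: { tool: 'browser_navigate' }, observation: 'Loaded page' }
  yield { action: { tool: 'browser_click' }, observation: 'Clicked link' }
}

// Same consumption pattern as with a real agent.stream() call.
async function consume(): Promise<string[]> {
  const lines: string[] = []
  for await (const step of mockStream()) {
    lines.push(`${step.action.tool}: ${step.observation}`)
  }
  return lines
}

consume().then(lines => console.log(lines.join('\n')))
```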
## Next Steps
Need Help? Join our community discussions on GitHub or check out the comprehensive examples in our repository!