mcp-use
What is mcp-use?
mcp-use is an open-source Unified MCP Client Library designed to connect any LLM (Large Language Model) to any MCP server using TypeScript/Node.js, enabling the creation of custom agents with tool access without relying on closed-source dependencies.
Use cases
Use cases for mcp-use include building chatbots, automating data processing tasks, creating interactive web applications, and developing custom AI agents that require access to multiple tools and servers.
How to use
To use mcp-use, developers can create an MCP-capable agent by writing just a few lines of TypeScript code, leveraging the LangChain.js framework to connect to various tools such as web browsing and file operations.
Key features
Key features include ease of use for quick agent creation, flexibility to work with any LangChain.js-supported LLM, direct HTTP/SSE connection to MCP servers, dynamic server selection, multi-server support, tool restrictions for safety, and the ability to build custom agents.
Where to use
mcp-use can be utilized in various fields such as AI development, automation, web applications, and any scenario where integration with LLMs and MCP servers is required.
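Concretely, the server configuration mcp-use consumes is a plain object mapping server names to launch commands. A self-contained sketch of that shape (an illustration based on the examples on this page, not the library's published type definitions):

```typescript
// Illustrative types for the config shape used throughout this page.
interface MCPServerConfig {
  command: string
  args: string[]
}

interface MCPConfig {
  mcpServers: Record<string, MCPServerConfig>
}

// One named server, launched via npx, as in the Quick Start below.
const config: MCPConfig = {
  mcpServers: {
    playwright: { command: 'npx', args: ['@playwright/mcp@latest'] },
  },
}

console.log(Object.keys(config.mcpServers)) // → [ 'playwright' ]
```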
Content
Unified MCP Client Library
🌐 MCP Client is the open-source way to connect any LLM to any MCP server in TypeScript/Node.js, letting you build custom agents with tool access without closed-source dependencies.
💡 Lets developers easily connect any LLM via LangChain.js to tools like web browsing, file operations, 3D modeling, and more.
✨ Key Features
| Feature | Description |
|---|---|
| 🔄 Ease of use | Create an MCP-capable agent in just a few lines of TypeScript. |
| 🤖 LLM Flexibility | Works with any LangChain.js-supported LLM that supports tool calling. |
| 🌐 HTTP Support | Direct SSE/HTTP connection to MCP servers. |
| ⚙️ Dynamic Server Selection | Agents select the right MCP server from a pool on the fly. |
| 🧩 Multi-Server Support | Use multiple MCP servers in one agent. |
| 🛡️ Tool Restrictions | Restrict unsafe tools like filesystem or network. |
| 🔧 Custom Agents | Build your own agents with LangChain.js adapter or implement new adapters. |
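The tool-restriction feature amounts to filtering the tool list a server exposes before the agent sees it. A simplified, self-contained sketch of the idea (this is not mcp-use's actual implementation; the names are illustrative):

```typescript
// Hypothetical sketch: drop disallowed tools by name before they reach the LLM.
interface Tool {
  name: string
  description: string
}

function filterTools(tools: Tool[], disallowed: string[]): Tool[] {
  const blocked = new Set(disallowed)
  return tools.filter((t) => !blocked.has(t.name))
}

const tools: Tool[] = [
  { name: 'browser_navigate', description: 'Open a URL in the browser' },
  { name: 'file_system', description: 'Read and write local files' },
]

// With 'file_system' disallowed, only the browser tool remains.
console.log(filterTools(tools, ['file_system']).map((t) => t.name))
// → [ 'browser_navigate' ]
```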
🚀 Quick Start
Requirements
- Node.js 22.0.0 or higher
- npm, yarn, or pnpm (examples use npm)
Installation
```shell
# Install from npm
npm install mcp-use

# LangChain.js and your LLM provider (e.g., OpenAI)
npm install langchain @langchain/openai dotenv
```
Create a `.env` file:

```
OPENAI_API_KEY=your_api_key
```
Basic Usage
```typescript
import { ChatOpenAI } from '@langchain/openai'
import { MCPAgent, MCPClient } from 'mcp-use'
import 'dotenv/config'

async function main() {
  // 1. Configure MCP servers
  const config = {
    mcpServers: {
      playwright: { command: 'npx', args: ['@playwright/mcp@latest'] }
    }
  }
  const client = MCPClient.fromDict(config)

  // 2. Create LLM
  const llm = new ChatOpenAI({ modelName: 'gpt-4o' })

  // 3. Instantiate agent
  const agent = new MCPAgent({ llm, client, maxSteps: 20 })

  // 4. Run query
  const result = await agent.run('Find the best restaurant in Tokyo using Google Search')
  console.log('Result:', result)
}

main().catch(console.error)
```
📂 Configuration File
You can store servers in a JSON file:
```json
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}
```
Load it:
```typescript
import { MCPClient } from 'mcp-use'

const client = MCPClient.fromConfigFile('./mcp-config.json')
```
🔄 Multi-Server Example
```typescript
const config = {
  mcpServers: {
    airbnb: { command: 'npx', args: ['@openbnb/mcp-server-airbnb'] },
    playwright: { command: 'npx', args: ['@playwright/mcp@latest'] }
  }
}

const client = MCPClient.fromDict(config)
const agent = new MCPAgent({ llm, client, useServerManager: true })

await agent.run('Search Airbnb in Barcelona, then Google restaurants nearby')
```
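If you prefer the configuration-file approach, the same two-server setup can be expressed in JSON and loaded with `MCPClient.fromConfigFile` (server names as above; the filename is up to you):

```json
{
  "mcpServers": {
    "airbnb": { "command": "npx", "args": ["@openbnb/mcp-server-airbnb"] },
    "playwright": { "command": "npx", "args": ["@playwright/mcp@latest"] }
  }
}
```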
🔒 Tool Access Control
```typescript
const agent = new MCPAgent({
  llm,
  client,
  disallowedTools: ['file_system', 'network']
})
```
👥 Contributors
- Zane
- Pietro Zullo
📜 License
MIT © Zane