mcp-inception
What is mcp-inception?
mcp-inception is a TypeScript-based MCP server that allows one MCP client to call another MCP client. It facilitates task delegation, offloading of context windows, and enables parallel and map-reduce execution of tasks.
Use cases
Use cases include querying multiple LLMs for information simultaneously, processing large datasets in parallel, and combining results from different sources to generate comprehensive outputs.
How to use
To use mcp-inception, set up the server and client as per the README instructions. Utilize the provided tools like execute_mcp_client, execute_parallel_mcp_client, and execute_map_reduce_mcp_client to perform tasks and queries efficiently.
Key features
Key features include the ability to offload context windows, delegate tasks, execute queries in parallel, and process multiple items with map-reduce functionality. It also integrates with mcp-client-cli for enhanced performance.
Where to use
mcp-inception can be used in various fields such as natural language processing, data analysis, and any application requiring efficient task delegation and processing across multiple models.
Disclaimer
OK, this is a difficult one. It will take some setting up, unfortunately. However, if you manage to make this more straightforward, please send me PRs.
mcp-inception MCP Server
Call another mcp client from your mcp client. Delegate tasks, offload context windows. An agent for your agent!
This is a TypeScript-based MCP server that implements a simple LLM query system.
- MCP Server and Client in one
- Made with use of mcp-client-cli
- Offload context windows
- Delegate tasks
- Parallel and map-reduce execution of tasks
Features
Tools
- `execute_mcp_client` - Ask a question to a separate LLM, ignore all the intermediate steps it takes when querying its tools, and return the output.
  - Takes `question` as a required parameter
  - Returns the answer, ignoring all the intermediate context
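Assuming the tool is invoked through the standard MCP `tools/call` request shape, a delegation call might look like the following sketch (only the tool name and the `question` parameter come from the list above; the question text is illustrative):

```json
{
  "method": "tools/call",
  "params": {
    "name": "execute_mcp_client",
    "arguments": {
      "question": "Summarize the open issues in this repository."
    }
  }
}
```

The delegated client runs its own tool-use loop internally; only the final answer comes back, which is what keeps the caller's context window small.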
- `execute_parallel_mcp_client` - Takes a list of inputs and a main prompt, and executes the prompt in parallel for each string in the input. E.g. get the current time in 6 major cities: London, Paris, Tokyo, Rio, New York, Sydney.
  - Takes the main prompt, "What is the time in this city?"
  - Takes a list of inputs: London, Paris, etc.
  - Runs the prompt in parallel for each input
  - Note: wait for this before using this feature
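A sketch of the city example above as an MCP `tools/call` request. The argument names `prompt` and `items` are assumptions for illustration; the README does not spell out the exact parameter names:

```json
{
  "method": "tools/call",
  "params": {
    "name": "execute_parallel_mcp_client",
    "arguments": {
      "prompt": "What is the time in this city?",
      "items": ["London", "Paris", "Tokyo", "Rio", "New York", "Sydney"]
    }
  }
}
```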
- `execute_map_reduce_mcp_client` - Process multiple items in parallel and then sequentially reduce the results to a single output.
  - Takes `mapPrompt` with an `{item}` placeholder for individual item processing
  - Takes `reducePrompt` with `{accumulator}` and `{result}` placeholders for combining results
  - Takes a list of `items` to process
  - Takes an optional `initialValue` for the accumulator
  - Processes items in parallel, then sequentially reduces the results
  - Example use case: analyze multiple documents, then synthesize key insights from all of them into a summary
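The map-reduce flow described above can be sketched in plain TypeScript. This is a hypothetical helper, not the server's actual implementation: the async map function stands in for running `mapPrompt` against each `{item}`, and the reducer stands in for `reducePrompt` combining `{accumulator}` and `{result}`:

```typescript
// Map items in parallel, then fold the results sequentially into an accumulator.
async function mapReduce<T, R, A>(
  items: T[],
  mapFn: (item: T) => Promise<R>,              // stands in for mapPrompt with {item}
  reduceFn: (accumulator: A, result: R) => A,  // stands in for reducePrompt
  initialValue: A,
): Promise<A> {
  // Promise.all preserves input order, so the reduce step is deterministic.
  const results = await Promise.all(items.map(mapFn)); // parallel map
  return results.reduce(reduceFn, initialValue);       // sequential reduce
}

// Example: "summarize" three documents, then concatenate the summaries.
const docs = ["alpha", "beta", "gamma"];
mapReduce(
  docs,
  async (doc) => doc.toUpperCase(),       // pretend per-item LLM call
  (acc, summary) => acc + summary + ";",  // combine into one output
  "",
).then((out) => console.log(out)); // ALPHA;BETA;GAMMA;
```

The real tool does the same thing with LLM calls in the map step, which is why the parallel phase dominates the wall-clock time.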
Development
Dependencies:
- Install mcp-client-cli
  - Also install its config file, and the MCP servers it needs, in `~/.llm/config.json`
- Create a bash script somewhere that activates the venv and executes the `llm` executable:
```bash
#!/bin/bash
source ./venv/bin/activate
llm --no-confirmations
```
Install dependencies:
```bash
npm install
```
Build the server:
```bash
npm run build
```
For development with auto-rebuild:
```bash
npm run watch
```
Installation
To use with Claude Desktop, add the server config:
- On macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- On Windows: `%APPDATA%/Claude/claude_desktop_config.json`
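A minimal sketch of the config entry, assuming the server was built with `npm run build` into `build/index.js`; the `/path/to/mcp-inception` portion is a placeholder for your actual checkout location:

```json
{
  "mcpServers": {
    "mcp-inception": {
      "command": "node",
      "args": ["/path/to/mcp-inception/build/index.js"]
    }
  }
}
```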
Debugging
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:
```bash
npm run inspector
```
The Inspector will provide a URL to access debugging tools in your browser.