Tupac
What is Tupac
tupac is a terminal MCP client that acts as a wrapper around the OpenAI Responses API, allowing users to quickly create LLM applications by specifying MCP configurations and system prompts.
Use cases
Use cases for tupac include building chatbots, creating automated response systems, and developing applications that require quick access to information through natural language queries.
How to use
To use tupac, run the command uvx tupac <config_file>.
Key features
Key features of tupac include support for MCP tools and resources, the ability to use environment variables in configuration files, and a straightforward configuration format that adheres to the standard MCP schema.
Where to use
tupac can be used in various fields where natural language processing and AI-driven responses are needed, such as customer support, information retrieval, and educational tools.
tupac
A GPT wrapper with MCP support. It’s a thin layer around the OpenAI Responses API,
with tool functions specified via MCP config.
You can write a simple “LLM app” very quickly by specifying MCP config and a system prompt.
MCP functionality supported:
- ✅ tools
- ✅ resources (only insofar as they are returned from tools; no listing or fetching)
Nothing else. It’s what I consider to be an absolute bare-bones MCP app.
Usage: LLM app
uvx tupac configs/web-search.json "When are we getting to Mars?"
You can get the config file by cloning the repo, or just copy/paste, or make your own.
It’s not a bad idea to add a bash alias for some of these:
alias ws-agent="uvx tupac ~/.config/tupac/web-search.json"
Configuration files may contain ${VARNAME} placeholders which are expanded
from the environment before parsing. Environment variables can also be loaded
from a .env file via python-dotenv. See configs/web-search.json for an
example using ${EXA_API_KEY}.
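The expand-before-parse behavior described above can be sketched like this. This is a minimal illustration, not tupac’s actual implementation; the placeholder regex and the error handling for missing variables are my assumptions:

```python
import json
import os
import re

def expand_env_vars(text: str) -> str:
    """Replace ${VARNAME} placeholders with values from the environment.

    Expansion happens on the raw text, before JSON parsing, matching the
    behavior described above. Raising on a missing variable is an assumption.
    """
    def replace(match: re.Match) -> str:
        name = match.group(1)
        if name not in os.environ:
            raise KeyError(f"environment variable {name} is not set")
        return os.environ[name]

    return re.sub(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}", replace, text)

# Demo with a placeholder key value:
os.environ["EXA_API_KEY"] = "demo-key"
raw = '{"url": "https://mcp.exa.ai/mcp?exaApiKey=${EXA_API_KEY}"}'
config = json.loads(expand_env_vars(raw))
print(config["url"])  # https://mcp.exa.ai/mcp?exaApiKey=demo-key
```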
Configuration format follows the standard MCP schema:
{
  "instructions": "Use search to answer questions.",
  "model": "o3",
  "mcpServers": {
    "exa": {
      "type": "url",
      "url": "https://mcp.exa.ai/mcp?exaApiKey=${EXA_API_KEY}"
    }
  }
}
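For illustration, here is how a program might load and sanity-check a config of this shape. The required-field checks are my assumptions about the schema, not a documented tupac validator:

```python
import json

# The example config from above, embedded as a string.
RAW_CONFIG = """
{
  "instructions": "Use search to answer questions.",
  "model": "o3",
  "mcpServers": {
    "exa": {
      "type": "url",
      "url": "https://mcp.exa.ai/mcp?exaApiKey=${EXA_API_KEY}"
    }
  }
}
"""

def load_config(text: str) -> dict:
    """Parse a tupac-style config and check that the fields shown above exist."""
    config = json.loads(text)
    for key in ("instructions", "model", "mcpServers"):
        if key not in config:
            raise ValueError(f"config is missing required key: {key}")
    for name, server in config["mcpServers"].items():
        if "type" not in server:
            raise ValueError(f"server {name!r} is missing a 'type' field")
    return config

config = load_config(RAW_CONFIG)
print(config["model"])  # o3
```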
Environment Variables
You can use the ${EXA_API_KEY} syntax to reference environment variables in the config file. tupac
also loads .env files, but only from the current working directory, so run it from the directory containing your .env.
Required environment variables
OPENAI_API_KEY
Variables required to run configs/web-search.json:
EXA_API_KEY (from your Exa account)
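A possible shell setup for the variables above. The key values are placeholders, and the final uvx invocation is commented out since it needs real keys:

```shell
# Export the required keys (placeholder values shown):
export OPENAI_API_KEY="sk-example"   # required by tupac itself
export EXA_API_KEY="exa-example"     # required by configs/web-search.json

# Sanity-check that both are set before running:
for var in OPENAI_API_KEY EXA_API_KEY; do
  [ -n "$(printenv "$var")" ] || { echo "missing $var" >&2; exit 1; }
done
echo "environment ready"

# With real keys in place, you would then run:
# uvx tupac configs/web-search.json "When are we getting to Mars?"
```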
Happy building!