universal-mcp-ui
What is universal-mcp-ui?
universal-mcp-ui is a protocol-level command-line interface (CLI) designed to interact with a Model Context Protocol (MCP) server, enabling users to send commands, query data, and manage various resources provided by the server.
Use cases
Use cases for universal-mcp-ui include developing AI applications, querying model outputs, testing different AI providers, and managing resources in a dynamic environment.
How to use
To use universal-mcp-ui, clone the repository, install the necessary dependencies, and run the client with the appropriate command-line arguments to connect to the desired MCP server and provider.
Key features
Key features include protocol-level communication with the MCP server, dynamic tool and resource exploration, and support for multiple providers and models, such as OpenAI and Ollama.
Where to use
universal-mcp-ui can be used in various fields that require interaction with AI models, data querying, and resource management, particularly in software development, data science, and research.
Content
Model Context Protocol CLI
This repository contains a protocol-level CLI designed to interact with a Model Context Protocol server. The client allows users to send commands, query data, and interact with various resources provided by the server.
Features
- Protocol-level communication with the MCP Server.
- Dynamic tool and resource exploration.
- Support for multiple providers and models:
- Providers: OpenAI, Ollama.
- Default models:
`gpt-4o` for OpenAI, `qwen2.5-coder` for Ollama.
Prerequisites
- Python 3.8 or higher.
- Required dependencies (see Installation).
- If using Ollama, make sure Ollama is installed and running.
- If using OpenAI, set an API key in your environment variables (`OPENAI_API_KEY=yourkey`).
Installation
- Clone the repository:
git clone https://github.com/chrishayuk/mcp-cli
cd mcp-cli
- Install UV:
pip install uv
- Resynchronize dependencies:
uv sync --reinstall
Usage
To start the client and interact with the SQLite server, run the following command:
uv run mcp-cli --server sqlite
Command-line Arguments
- `--server`: Specifies the server configuration to use. Required.
- `--config-file`: (Optional) Path to the JSON configuration file. Defaults to `server_config.json`.
- `--provider`: (Optional) Specifies the provider to use (`openai` or `ollama`). Defaults to `openai`.
- `--model`: (Optional) Specifies the model to use. Defaults depend on the provider: `gpt-4o` for OpenAI, `llama3.2` for Ollama.
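The README does not show the contents of `server_config.json`. As a rough sketch, assuming it follows the `mcpServers` layout common to MCP clients, a configuration defining the `sqlite` server might look like the following; the field names and the server launch command here are illustrative assumptions, not taken from this repository:

```json
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "test.db"]
    }
  }
}
```

The key under `mcpServers` is what you would pass to `--server`.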
Examples
Run the client with the default OpenAI provider and model:
uv run mcp-cli --server sqlite
Run the client with a specific configuration and Ollama provider:
uv run mcp-cli --server sqlite --provider ollama --model llama3.2
Interactive Mode
The client supports interactive mode, allowing you to execute commands dynamically. Type `help` for a list of available commands or `quit` to exit the program.
Supported Commands
- `ping`: Check if the server is responsive.
- `list-tools`: Display available tools.
- `list-resources`: Display available resources.
- `list-prompts`: Display available prompts.
- `chat`: Enter interactive chat mode.
- `clear`: Clear the terminal screen.
- `help`: Show a list of supported commands.
- `quit`/`exit`: Exit the client.
Chat Mode
To enter chat mode and interact with the server:
uv run mcp-cli --server sqlite
In chat mode, you can use tools and query the server interactively. The provider and model used are specified during startup and displayed as follows:
Entering chat mode using provider 'ollama' and model 'llama3.2'...
Using the OpenAI Provider:
If you wish to use OpenAI models, set the `OPENAI_API_KEY` environment variable before running the client, either in a `.env` file or as an environment variable.
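For example, either of the following would make the key available to the client; the key value shown is a placeholder:

```shell
# Option 1: export the key for the current shell session.
export OPENAI_API_KEY="sk-your-key-here"

# Option 2: persist it in a .env file in the project root.
echo 'OPENAI_API_KEY=sk-your-key-here' > .env
```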
Contributing
Contributions are welcome! Please open an issue or submit a pull request with your proposed changes.
License
This project is licensed under the MIT License.