MCP Explorer

Ollama MCP Chat

@godstaleon a year ago
Overview

What is Ollama MCP Chat?

Ollama MCP Chat is a desktop chatbot application that integrates Ollama’s local LLM models with MCP (Model Context Protocol) servers, enabling users to create interactive AI applications with a graphical user interface.

Use cases

Use cases include creating personalized chatbots for customer service, developing interactive learning applications, and building tools for data analysis and visualization.

How to use

To use Ollama MCP Chat, clone the repository, install the dependencies with uv, set up Ollama and download a model, configure the MCP server if needed, and run the application with uv run main.py.

Key features

Key features include running Ollama LLM models locally for free, integrating various tools via MCP servers, managing chat history, providing real-time streaming responses, and offering an intuitive desktop GUI based on PySide6.

Where to use

Ollama MCP Chat can be used in various fields such as software development, AI application development, customer support automation, and educational tools.

Content

Ollama MCP Chat

Ollama MCP Chat is a desktop chatbot application that integrates Ollama’s local LLM models with MCP (Model Context Protocol) servers, supporting various tool calls and extensible features. It provides a GUI based on Python and PySide6, and allows you to freely extend its capabilities via MCP servers.

This project can serve as useful base code for developers who want to build Python AI applications with a GUI.

Key Features

  • Run Ollama LLM models locally for free
  • Integrate and call various tools via MCP servers
  • Manage and save chat history
  • Real-time streaming responses and tool call results
  • Intuitive desktop GUI (PySide6-based)
  • GUI support for adding, editing, and removing MCP servers

System Requirements

  • Python 3.12 or higher
  • Ollama installed (for local LLM execution)
  • uv (recommended for package management)
  • MCP server (implement your own or use external MCP servers)
  • smithery.ai (recommended as a registry for finding MCP servers)

Installation

  1. Clone the repository
git clone https://github.com/your-repo/ollama-mcp-chat.git
cd ollama-mcp-chat
  2. Install uv (if not installed)
# Using pip
pip install uv

# Or using curl (Unix-like systems)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Or using PowerShell (Windows)
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
  3. Install dependencies
uv sync
  4. Install Ollama and download a model
# Install Ollama (see https://ollama.ai for details)
ollama pull <model-name>
  5. MCP server configuration (optional)
  • Add MCP server information to the mcp_config.json file
  • Example:
{
  "mcpServers": {
    "weather": {
      "command": "python",
      "args": [
        "./mcp_server/mcp_server_weather.py"
      ],
      "transport": "stdio"
    }
  }
}
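Before launching the app, the config file can be sanity-checked with a few lines of standard-library Python. This is only an illustrative sketch of the kind of validation mcp_server/mcp_manager.py performs; the function name and checks here are assumptions, not the project's actual API:

```python
import json

def validate_mcp_config(text: str) -> dict:
    """Parse mcp_config.json text and check the fields each server entry needs."""
    config = json.loads(text)
    servers = config.get("mcpServers", {})
    for name, entry in servers.items():
        for field in ("command", "args", "transport"):
            if field not in entry:
                raise ValueError(f"server '{name}' is missing '{field}'")
    return servers

example = """
{
  "mcpServers": {
    "weather": {
      "command": "python",
      "args": ["./mcp_server/mcp_server_weather.py"],
      "transport": "stdio"
    }
  }
}
"""
servers = validate_mcp_config(example)
```

A malformed entry (for example, one missing "command") would raise a ValueError before the app tries to spawn the server process.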

How to Run

uv run main.py
  • The GUI will launch, and you can start chatting and using MCP tools.

Main Files

  • ui/chat_window.py: Main GUI window, handles chat/history/settings/server management
  • agent/chat_history.py: Manages and saves/loads chat history
  • worker.py: Handles asynchronous communication with LLM and MCP servers
  • agent/llm_ollama.py: Integrates Ollama LLM and MCP tools, handles streaming responses
  • mcp_server/mcp_manager.py: Manages and validates MCP server configuration files
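The split between the GUI window and worker.py suggests a standard async producer/consumer pattern: the worker streams tokens from the LLM and hands them to the UI as they arrive. A minimal standard-library sketch of that pattern, with all names hypothetical (this is not the project's actual worker code):

```python
import asyncio

async def stream_reply(prompt: str):
    """Stand-in for an LLM call: yield a reply token by token."""
    for token in f"echo: {prompt}".split():
        await asyncio.sleep(0)  # yield control, as a real network read would
        yield token

async def worker(prompt: str) -> str:
    """Collect streamed tokens, as the worker would before updating the window."""
    chunks = [tok async for tok in stream_reply(prompt)]
    return " ".join(chunks)

reply = asyncio.run(worker("hello"))
```

In the real application each yielded chunk would be forwarded to the PySide6 window (e.g. via a signal) rather than collected into a list.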

Extending MCP Servers

  1. Add new MCP server information to mcp_config.json
  2. Implement and prepare the MCP server executable
  3. Restart the application and check the MCP server list in the GUI

Chat History

  • All conversations are automatically saved to chat_history.json
  • You can load previous chats or start a new chat from the GUI
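The save/load cycle can be pictured with plain json; the actual schema of chat_history.json may differ, and the role/content field names below are assumptions borrowed from common chat formats:

```python
import json
import os
import tempfile

# A hypothetical two-turn conversation in a common chat-message shape
history = [
    {"role": "user", "content": "hello"},
    {"role": "assistant", "content": "Hi! How can I help?"},
]

# Write the history to a JSON file, as the app does after each turn
path = os.path.join(tempfile.mkdtemp(), "chat_history.json")
with open(path, "w", encoding="utf-8") as f:
    json.dump(history, f, ensure_ascii=False, indent=2)

# Reload it, as the app does when restoring a previous chat
with open(path, encoding="utf-8") as f:
    loaded = json.load(f)
```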

Exit Commands

  • Type quit, exit, or bye in the program to exit
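The exit check amounts to matching the message against a small set of commands. A minimal sketch, assuming (as is typical) that case and surrounding whitespace are ignored:

```python
# Commands that end the session, per the README
EXIT_COMMANDS = {"quit", "exit", "bye"}

def is_exit(message: str) -> bool:
    """True if the message is an exit command, ignoring case and whitespace."""
    return message.strip().lower() in EXIT_COMMANDS
```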

Notes

  • Basic LLM chat works even without MCP server configuration
  • Be mindful of your PC’s performance and memory usage, especially with large LLM models
  • MCP servers can be implemented in Python, Node.js, or other languages, and external MCP servers are also supported

License

MIT License
