MCP Ollama Streamlit Agent
What is mcp-ollama-streamlit-agent?
mcp-ollama-streamlit-agent is a fully functional multi-agent chatbot that utilizes the Model Context Protocol (MCP) and the Ollama qwen3:1.7b model, featuring a Streamlit-based frontend. It supports tool calling and integrates various domain-specific utilities such as weather APIs, math evaluation, and CSV dataset analysis.
Use cases
Use cases include: 1) A virtual assistant that provides weather updates, 2) An educational tool that helps students solve math problems, and 3) A data analysis tool that allows users to query and analyze CSV datasets.
How to use
To use mcp-ollama-streamlit-agent, run the Streamlit application by executing the app.py file. Users can interact with the chatbot through the frontend interface, where they can input queries and receive responses powered by the MCP and Ollama model.
Key features
Key features include: 1) Weather Tools for fetching alerts and forecasts, 2) Math Evaluator for safe arithmetic expression evaluation, and 3) Dataset Inspector for performing analysis on CSV files, including summary statistics and NLP-powered queries.
Where to use
mcp-ollama-streamlit-agent can be used in various fields such as education for tutoring, customer service for automated responses, data analysis for insights from datasets, and weather forecasting applications.
Content
🧠 MCP + Ollama Streamlit Chatbot
This repository contains a fully functional multi-agent chatbot powered by the Model Context Protocol (MCP), Ollama with the qwen3:1.7b model, and a Streamlit-based frontend. The chatbot supports tool calling and integrates domain-specific utilities like weather APIs, math evaluation, and CSV dataset analysis.
📁 Project Structure
.
├── app.py        # Streamlit frontend interface
├── client.py     # Async MCP client with Ollama integration and tool handling
├── server.py     # MCP-compatible server with weather, math, and dataset tools
├── data/
│   └── dataset.csv   # Sample CSV file for dataset analysis
├── .env          # Environment variables (MCP_SSE_URL, etc.)
🧰 Features & Tools
The assistant supports the following built-in tools, exposed via the MCP server:
- 🌤️ Weather Tools – Fetch alerts and forecasts using the National Weather Service API
- ➗ Math Evaluator – Safe evaluation of arithmetic expressions
- 📊 Dataset Inspector – Summary statistics, shape, and NLP-powered queries on a local dataset.csv file
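The Math Evaluator promises "safe evaluation of arithmetic expressions". As an illustration of what that can mean (this is a minimal sketch, not the repository's actual server.py code), an `ast`-based evaluator with a whitelist of operators avoids `eval()` entirely:

```python
import ast
import operator

# Whitelisted operators: anything outside this table is rejected.
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def safe_eval(expression: str) -> float:
    """Evaluate an arithmetic expression without calling eval()."""
    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError("unsupported expression")
    return _eval(ast.parse(expression, mode="eval"))

print(safe_eval("2 + 3 * 4"))   # 14 (operator precedence respected)
print(safe_eval("81 ** 0.5"))   # 9.0
```

Anything that is not a number or a whitelisted operation (function calls, attribute access, names) raises `ValueError`, which is the property that makes the tool safe to expose to the model.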
flowchart TD
    subgraph Streamlit_UI
        A1[User Prompt]
        A2[Display Chat History]
        A3[Streamlit App - app_py]
    end
    subgraph MCP_Client
        B1[Connect to SSE Server]
        B2[Process Query]
        B3[Call Ollama API]
        B4[Handle Tool Calls]
    end
    subgraph MCP_Server
        C1[Weather Alerts Tool]
        C2[Forecast Tool]
        C3[Math Evaluation Tool]
        C4[Dataset Analysis Tool]
        C5[Dataset Query Tool]
    end
    subgraph Ollama_Model
        D1[qwen3 1_7b Model]
    end
    A1 --> A3
    A2 --> A3
    A3 --> B2
    B2 --> B3
    B3 --> D1
    D1 --> B4
    B4 -->|Tool Call| C1
    B4 -->|Tool Call| C2
    B4 -->|Tool Call| C3
    B4 -->|Tool Call| C4
    B4 -->|Tool Call| C5
    C1 --> B2
    C2 --> B2
    C3 --> B2
    C4 --> B2
    C5 --> B2
🔧 Requirements
Ensure you have the following installed:
- Python 3.9+
- Ollama with the qwen3:1.7b model available locally
- MCP library (see installation below)
- Streamlit
- Uvicorn for the ASGI server
- A .env file with the MCP server URL defined: MCP_SSE_URL=http://localhost:8080/sse
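The .env file is normally loaded with python-dotenv's `load_dotenv()`. A stdlib-only sketch of what that loading amounts to (a simplified stand-in, not the library's implementation):

```python
import os

def load_env(text: str) -> None:
    """Minimal stand-in for python-dotenv's load_dotenv():
    parse KEY=VALUE lines and put them into os.environ."""
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # setdefault: real environment variables take precedence
        os.environ.setdefault(key.strip(), value.strip())

load_env("MCP_SSE_URL=http://localhost:8080/sse")
print(os.environ["MCP_SSE_URL"])  # http://localhost:8080/sse
```

The client can then read the server URL with `os.getenv("MCP_SSE_URL")` regardless of whether it came from the shell or the .env file.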
Python Packages
You can install all required packages via:
pip install -r requirements.txt
If you don’t have a requirements.txt, use:
pip install streamlit uvicorn httpx python-dotenv pandas scikit-learn mcp
▶️ How to Run
1. Launch the MCP Server (Tool Provider)
Run the server to expose SSE-compatible endpoints:
python server.py --host 0.0.0.0 --port 8080
This will start a FastAPI-compatible MCP server exposing tools on:
http://localhost:8080/sse
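Under the hood, an MCP tool server is essentially a mapping from tool names to handlers. The repository's server.py presumably uses the mcp library's own registration decorators; the sketch below shows only the underlying dispatch pattern with stdlib code, with a hypothetical `get_forecast` stub standing in for a real tool:

```python
import asyncio
import json
from typing import Any, Awaitable, Callable

# Registry mapping tool names to async handlers.
TOOLS: dict[str, Callable[..., Awaitable[Any]]] = {}

def tool(name: str):
    """Decorator that registers an async function as a callable tool."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("get_forecast")
async def get_forecast(city: str) -> str:
    # Placeholder: the real tool would call the National Weather Service API.
    return f"Forecast for {city}: (stub)"

async def call_tool(name: str, arguments: dict) -> str:
    """Dispatch a tool call by name, as the server does for the model."""
    if name not in TOOLS:
        return json.dumps({"error": f"unknown tool {name!r}"})
    return await TOOLS[name](**arguments)

print(asyncio.run(call_tool("get_forecast", {"city": "San Francisco"})))
```

When the model emits a tool call, the client forwards the name and JSON arguments to this dispatch step and feeds the returned string back into the conversation.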
2. Start the Streamlit Frontend
In another terminal:
streamlit run app.py
This will open the chat interface in your browser at:
http://localhost:8501
📦 Ollama Model Setup
Install and run Ollama:
ollama pull qwen3:1.7b
ollama run qwen3:1.7b
Ensure the model is loaded and responding at:
http://localhost:11434/api/chat
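A quick way to check that endpoint, and to see the request shape the client sends: Ollama's /api/chat accepts a JSON body with `model`, `messages`, and `stream` fields. The sketch below builds such a payload; the commented-out request requires a running Ollama instance, and the exact response handling here is an assumption rather than the repository's client.py code:

```python
import json
from urllib import request

def build_chat_request(prompt: str, model: str = "qwen3:1.7b") -> dict:
    """Build the JSON body for Ollama's /api/chat endpoint
    (non-streaming, single user message)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

body = build_chat_request("What is 2 + 3 * 4?")
print(json.dumps(body))

# To actually send it (requires Ollama running locally):
# req = request.Request(
#     "http://localhost:11434/api/chat",
#     data=json.dumps(body).encode(),
#     headers={"Content-Type": "application/json"},
# )
# with request.urlopen(req) as resp:
#     print(json.load(resp)["message"]["content"])
```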
🧪 Supported Use Cases
Here are some queries you can try:
Weather
- What’s the weather in San Francisco?
- Are there any alerts in NY?
Math
- What is 2 + 3 * 4?
- Calculate the square root of 81
Dataset Analysis
- What are the columns in the dataset?
- How many records are in the file?
- Do any descriptions mention “cloud” or “AI”?
🧼 Resetting State
You can reset the chat history using the sidebar button in the Streamlit UI.
📁 Notes
- Make sure data/dataset.csv exists if using the dataset tools.
- Ensure MCP_SSE_URL in .env matches your server setup.
- This system uses asyncio.run_coroutine_threadsafe() to allow asynchronous tool execution within Streamlit's synchronous model.
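The asyncio.run_coroutine_threadsafe() bridge can be sketched in isolation: a background thread owns the event loop, and synchronous code (such as a Streamlit callback) submits coroutines to it and blocks on the returned future. The tool-call coroutine below is a stub standing in for the client's real async work:

```python
import asyncio
import threading

# A dedicated background thread runs the event loop forever.
loop = asyncio.new_event_loop()
threading.Thread(target=loop.run_forever, daemon=True).start()

async def call_tool_async(name: str) -> str:
    await asyncio.sleep(0.01)  # stand-in for an awaited MCP tool call
    return f"result of {name}"

def call_tool_sync(name: str) -> str:
    """Callable from synchronous code: submit the coroutine to the
    background loop and block until it finishes."""
    future = asyncio.run_coroutine_threadsafe(call_tool_async(name), loop)
    return future.result(timeout=5)

print(call_tool_sync("get_alerts"))  # result of get_alerts
```

This is why the Streamlit UI can stay a plain synchronous script while the MCP client underneath remains fully async.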