MCP Explorer

Mcp Ollama File Agent

@rajeevchandra · a year ago
AI Systems
mcp-ollama-file-agent

Overview

What is mcp-ollama-file-agent

mcp-ollama-file-agent is a local AI agent that uses the Model Context Protocol (MCP) to interact with the local filesystem, enabling capabilities such as file summarization and tool use.

Use cases

Use cases include summarizing reports from local files, automating data extraction from documents, enhancing productivity in software development by providing AI assistance, and creating interactive applications with a user-friendly interface.

How to use

To use mcp-ollama-file-agent, start the Docker stack with `docker-compose up --build`. Access the Streamlit UI at http://localhost:8501 to interact with the agent, and view agent output in the terminal.

Key features

Key features include running local large language models (LLMs) such as qwen2:7b, mistral, and phi3; enabling tool use via MCP; fetching and summarizing local files; and a fully Dockerized, offline-capable setup.
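The file-summarization feature rests on a simple idea: read a file's text and trim it so it fits in the local model's context window before asking the model to summarize it. A minimal sketch of such a helper (the function name and character limit are illustrative, not taken from the repository):

```python
from pathlib import Path

def read_file_for_summary(path: str, max_chars: int = 4000) -> str:
    """Read a local text file, truncating it to fit an LLM context window.

    The returned excerpt is what an agent would pass to the local model
    along with a "summarize this" instruction.
    """
    text = Path(path).read_text(encoding="utf-8")
    if len(text) > max_chars:
        return text[:max_chars] + "\n...[truncated]"
    return text
```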

Where to use

mcp-ollama-file-agent can be used in various fields such as data analysis, document summarization, software development, and any application requiring local file interaction and AI capabilities.

Content

🧠 Ollama MCP Local Agent

Local AI Agents with Tool Use (Filesystem) Powered by Ollama + LangChain + MCP

🚀 Features

  • Run local LLMs like qwen2:7b, mistral, phi3, etc.
  • Enable tool use via Model Context Protocol (MCP)
  • Fetch and summarize files from your filesystem
  • Streamlit UI + Python Agent
  • Fully Dockerized & offline-capable
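In outline, tool use means the model emits a structured request naming a tool, the agent executes it, and the result is fed back into the conversation. A minimal, library-free sketch of the dispatch step (names and call shape are illustrative; the actual project handles this via LangChain and MCP):

```python
import json
from pathlib import Path

# Registry of tools the agent exposes to the model (names are illustrative).
TOOLS = {
    "read_file": lambda args: Path(args["path"]).read_text(encoding="utf-8"),
}

def dispatch(tool_call_json: str) -> str:
    """Execute a model-issued tool call shaped like
    {"tool": "read_file", "arguments": {"path": "docs/example.txt"}}
    and return the tool's output as a string."""
    call = json.loads(tool_call_json)
    tool = TOOLS.get(call["tool"])
    if tool is None:
        return f"error: unknown tool {call['tool']!r}"
    return tool(call["arguments"])
```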

📦 Project Structure

.
├── agent_runner.py
├── file_tool.py
├── mcp-tool.json
├── streamlit_app.py
├── Dockerfile
├── Dockerfile.agent
├── docker-compose.yml
├── docs/
│   └── example.txt
├── .gitignore
└── README.md
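The contents of `mcp-tool.json` are not shown here, but an MCP tool declaration typically pairs a tool name and description with a JSON Schema describing its arguments. An illustrative (not actual) entry for a file-reading tool might look like:

```json
{
  "name": "read_file",
  "description": "Read a text file from the local filesystem",
  "inputSchema": {
    "type": "object",
    "properties": {
      "path": { "type": "string", "description": "Path to the file to read" }
    },
    "required": ["path"]
  }
}
```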

🐳 Run with Docker

docker-compose up --build
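The compose file itself is not reproduced in this listing. As a sketch only, a stack like this usually wires together the Ollama server, the agent, and the Streamlit UI; the service names, image, and port mappings below are assumptions, not the project's actual file:

```yaml
services:
  ollama:
    image: ollama/ollama           # serves the local LLM (qwen2:7b, mistral, ...)
    ports:
      - "11434:11434"
  agent:
    build:
      context: .
      dockerfile: Dockerfile.agent # runs the Python agent (agent_runner.py)
    depends_on:
      - ollama
  ui:
    build: .                       # Dockerfile for the Streamlit app
    ports:
      - "8501:8501"                # Streamlit UI at http://localhost:8501
```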

📜 License

MIT

Tools

No tools
