MCP Explorer

AIAgent-MCP

@AhilanPonnusamy · 9 months ago
1 · MIT
Free · Community
AI Systems
A simple local POC of AI agents and the MCP protocol

Overview

What is AIAgent-MCP

AIAgent-MCP is a simple local proof of concept (POC) for AI agents utilizing the MCP protocol. It demonstrates a lightweight agentic AI system powered by the quantized gemma3:12b model running on Ollama, integrating various tools for enhanced reasoning capabilities.

Use cases

Use cases for AIAgent-MCP include querying for the current time, retrieving the latest AI news articles, and summarizing complex texts, making it suitable for both casual users and developers looking to integrate AI functionalities.

How to use

To use AIAgent-MCP, clone the Git repository, create and activate a Python virtual environment, start the Ollama server with the gemma3:12b model, run the MCP tool server, and then start the agent client and Streamlit UI to interact with the AI agent.

Key features

Key features include integration with the MCP protocol, the ability to perform tool-augmented reasoning, and functionalities such as fetching the current time, summarizing articles, and retrieving the latest AI news.

Where to use

AIAgent-MCP can be used in various fields including AI research, software development, educational tools, and any application requiring interactive AI assistance and information retrieval.

Content

AIAgent-MCP: Agentic AI App with MCP and OpenAPI Integration

Additional details about this POC are provided in the accompanying blog post.

This project demonstrates a lightweight agentic AI system powered by a quantized gemma3:12b model running on Ollama, with tool integration via the MCP server. The goal is to test tool-augmented reasoning like fetching current time, summarizing articles, or retrieving the latest AI news.
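The tool-augmented reasoning described above can be sketched as a minimal dispatcher. The tool names (time, ainews, fetch) come from this README, but the keyword matching and stand-in return values below are illustrative assumptions, not the project's actual code — in the real system, the LLM itself decides when to call a tool via the MCP server.

```python
from datetime import datetime, timezone

# Illustrative stand-ins; the real project serves these tools via MCP/mcpo.
def time_tool() -> str:
    """Return the current UTC time as an ISO-8601 string."""
    return datetime.now(timezone.utc).isoformat()

def ainews_tool() -> list[str]:
    """Stand-in for the ainews tool; the real tool fetches live articles."""
    return ["(article headlines would be fetched here)"]

def fetch_tool(url: str) -> str:
    """Stand-in for the fetch tool; the real tool downloads and summarizes."""
    return f"(summary of {url} would go here)"

def route(prompt: str) -> str:
    """Naive keyword routing -- an LLM agent would decide this via tool-calling."""
    p = prompt.lower()
    if "time" in p:
        return "time"
    if "news" in p:
        return "ainews"
    if "http" in p or "summarize" in p:
        return "fetch"
    return "llm"  # no tool needed; answer directly from the model

print(route("What is the current time?"))         # -> time
print(route("Can you get latest AI news?"))       # -> ainews
print(route("What is the distance to the moon?")) # -> llm
```

The interesting part in the real system is that the routing decision is made by the quantized model's reasoning rather than hard-coded keywords, which is exactly what the example prompts later in this README exercise.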


🚀 Getting Started

1. Clone the Git Repository

git clone https://github.com/AhilanPonnusamy/AIAgent-MCP.git
cd AIAgent-MCP

2. Create and Activate a Virtual Environment

Use Python 3.11 and the following naming convention to avoid updating configs manually.

python3.11 -m venv mcplatest-venv
source mcplatest-venv/bin/activate

3. Start the Ollama Server and Model

Run the Gemma 3 12B model locally using Ollama:

ollama run gemma3:12b

4. Start the MCPO Tool Server

uvx mcpo --config config.json --port 8001

Access the tool endpoint docs at: http://localhost:8001/ainews/docs
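The repository ships its own config.json, which is what the command above reads. As a hedged illustration only: mcpo configs generally follow the Claude-Desktop-style mcpServers layout, so a file wiring up a time server and an ainews tool might look roughly like this (the server names, commands, and arguments here are assumptions, not the project's actual file):

```json
{
  "mcpServers": {
    "time": {
      "command": "uvx",
      "args": ["mcp-server-time"]
    },
    "ainews": {
      "command": "python",
      "args": ["ainews_server.py"]
    }
  }
}
```

mcpo exposes each named server under its own URL path, which is why the docs URL above contains /ainews/.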

5. Start the Agent (Client)

In a new terminal (inside virtual environment):

source mcplatest-venv/bin/activate
uvicorn agentic_client:app --reload --port 8000

6. Start the Streamlit UI

In another terminal (also inside virtual environment):

source mcplatest-venv/bin/activate
streamlit run app-ui.py

Example Prompts to Try

Start chatting from the Streamlit UI!

“What is the distance to the moon?”

“What is the current time?” → Should trigger the time tool.

“Can you get latest AI news?” → Should trigger ainews tool and return top 10 articles.

“Can you summarize this article for me: Efficient Fine-Tuning of Language Models with Low-Rank Adapters - https://arxiv.org/abs/2405.16746?” → Should invoke fetch tool and return a nice summary.
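Outside the Streamlit UI, the same prompts can in principle be sent straight to the FastAPI client from step 5. The route "/chat" and the field name "prompt" below are assumptions for illustration; check agentic_client.py for the actual endpoint.

```python
import json

# Hypothetical request payload for the agent's FastAPI endpoint on port 8000.
# Field name "prompt" and route "/chat" are assumptions, not the project's API.
payload = {"prompt": "What is the current time?"}
body = json.dumps(payload)

# With the servers from steps 3-5 running, this could be POSTed, e.g.:
#   curl -X POST http://localhost:8000/chat \
#        -H "Content-Type: application/json" \
#        -d '{"prompt": "What is the current time?"}'
print(body)
```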

[!WARNING]
Due to the nature of low-precision quantized models, behavior may sometimes be inconsistent. When in doubt, restart the servers and the LLM (ollama stop) and things should be back to normal!

🎉 Have Fun!

This project is your sandbox for building powerful local agentic systems with reasoning and tool-usage capabilities. Extend, explore, and enjoy! 😄

Tools

No tools
