MCP Explorer

llm-mcp-fastapi

@ahmadalsharef994 · 13 days ago
MIT · Free · Community
AI Systems
A minimal FastAPI backend for exposing tools to local LLMs using MCP.

Overview

What is llm-mcp-fastapi

llm-mcp-fastapi is a minimal backend designed to expose tools to local LLMs (like LLaMA 3 via Ollama) using the Model Context Protocol (MCP).

Use cases

Use cases include retrieving live data (such as weather) through tool invocation, integrating local LLMs with other applications, and extending an LLM's capabilities beyond its training data.

How to use

To use llm-mcp-fastapi, clone the repository, set up a virtual environment, install the required dependencies, and run the FastAPI application using Uvicorn.

Key features

Key features include a FastAPI backend, tool listing at `/mcp/tools/list`, tool invocation at `/mcp/tools/invoke`, and compatibility with Ollama and any LLM that supports tool calling.

Where to use

llm-mcp-fastapi can be used in various fields such as AI development, natural language processing, and any application requiring interaction with local LLMs.

Content

MCP + FastAPI + Ollama 🧠🚀

This repo is a minimal backend for exposing tools to local LLMs (like LLaMA 3 via Ollama) using the Model Context Protocol (MCP).
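At its core, a backend like this is a tool registry plus a dispatcher sitting behind the two MCP routes. A minimal sketch of that pattern in plain Python — the names (`TOOLS`, `list_tools`, `invoke_tool`) and the exact response shapes are illustrative assumptions, not the repo's actual code:

```python
# Illustrative tool registry and dispatcher, mirroring what a backend
# serving /mcp/tools/list and /mcp/tools/invoke typically does.

def get_weather(city: str) -> dict:
    """Dummy weather tool; a real version would call a weather API."""
    return {"city": city, "forecast": "sunny", "temp_c": 21}

TOOLS = {
    "get_weather": {
        "handler": get_weather,
        "schema": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "inputSchema": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
}

def list_tools() -> list[dict]:
    # What a /mcp/tools/list handler would return
    return [t["schema"] for t in TOOLS.values()]

def invoke_tool(name: str, arguments: dict) -> dict:
    # What a /mcp/tools/invoke handler would do with its JSON body
    if name not in TOOLS:
        return {"error": f"unknown tool: {name}"}
    return {"result": TOOLS[name]["handler"](**arguments)}
```

In the real app these two functions would simply be wrapped by FastAPI route handlers; keeping the registry separate from the routes makes adding new tools a one-dictionary-entry change.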

🛠 Features

  • [x] FastAPI backend
  • [x] Tool listing (/mcp/tools/list)
  • [x] Tool invocation (/mcp/tools/invoke)
  • [x] Example: get_weather(city)
  • [x] Works with Ollama and any LLM that supports tool calling
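For the last point, the LLM client has to advertise the tool to the model so it can decide to call it. A sketch of what that payload might look like for Ollama's `/api/chat` endpoint, which uses OpenAI-style function schemas for tool calling — the model name and exact wiring to this repo are assumptions:

```python
# Build the "tools" entry an Ollama /api/chat request would carry so the
# model can choose to call get_weather. The schema format is the
# OpenAI-style function schema that Ollama uses for tool calling.

def weather_tool_schema() -> dict:
    return {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }

def build_chat_request(prompt: str) -> dict:
    # Body for POST http://localhost:11434/api/chat (Ollama's default port).
    # Tool calling requires a tool-capable model, e.g. llama3.1 or newer.
    return {
        "model": "llama3.1",
        "messages": [{"role": "user", "content": prompt}],
        "tools": [weather_tool_schema()],
        "stream": False,
    }
```

When the model returns a `tool_calls` entry in its reply, the client would forward the requested name and arguments to `/mcp/tools/invoke` and feed the result back as a `tool` message.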

🚀 Quick Start

```shell
git clone https://github.com/YOUR_USERNAME/llm-mcp-fastapi.git
cd llm-mcp-fastapi
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
uvicorn app.main:app --reload
```
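Once the server is running (Uvicorn binds to http://127.0.0.1:8000 by default), the endpoints can be exercised with any HTTP client. A standard-library sketch — the JSON body shape for `/mcp/tools/invoke` is an assumption about this repo's API, not documented fact:

```python
import json
import urllib.request

BASE = "http://127.0.0.1:8000"  # Uvicorn's default bind address

def build_invoke_request(name: str, arguments: dict) -> urllib.request.Request:
    # Assumed request body: {"name": ..., "arguments": {...}}
    body = json.dumps({"name": name, "arguments": arguments}).encode()
    return urllib.request.Request(
        f"{BASE}/mcp/tools/invoke",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def call(req) -> dict:
    # Send the request and decode the JSON response
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage (requires the server to be running):
# tools = call(urllib.request.Request(f"{BASE}/mcp/tools/list"))
# result = call(build_invoke_request("get_weather", {"city": "Berlin"}))
```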
