MCP Assistant
What is MCP Assistant
MCP_Assistant is an AI assistant built on the Model Context Protocol (MCP) using Groq and LangChain, featuring a Streamlit frontend and FastAPI backend for chat interactions powered by large language models.
Use cases
Use cases include automated customer service chatbots, educational tutoring systems, and interactive virtual assistants for web browsing and information retrieval.
How to use
To use MCP_Assistant, clone the repository, install the dependencies, set up your environment variables, configure optional MCP tools, and run the FastAPI backend followed by the Streamlit frontend.
Key features
Key features include Groq LLM integration with memory-enabled conversation, MCP server integration for Google Search and Playwright, a FastAPI backend for chat API requests, a real-time chat interface via Streamlit, and Docker support for containerized deployment.
Where to use
MCP_Assistant can be used in various fields such as customer support, educational tools, and any application requiring interactive chat capabilities powered by AI.
MCP-based AI Assistant
This is an MCP-based AI assistant built with Groq, LangChain, and MCP, with a Streamlit frontend and a FastAPI backend. It enables chat interactions powered by large language models, enhanced with optional tools such as search agents and browser automation.
Features
- Groq LLM with LangChain and memory-enabled conversation
- MCP server integration, e.g. Google Search and Playwright
- FastAPI backend to handle chat API requests
- Streamlit frontend for real-time chat
- Docker support for containerized deployment
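The memory-enabled conversation feature can be illustrated with a stdlib-only toy: each turn is appended to a history that is replayed to the model on the next call. (The real app uses LangChain's memory classes with the Groq LLM; this class is purely illustrative.)

```python
# Toy illustration of conversation memory, not the LangChain API.
class ConversationMemory:
    def __init__(self):
        self.history = []  # list of {"role": ..., "content": ...} turns

    def add_turn(self, role, content):
        self.history.append({"role": role, "content": content})

    def as_prompt(self):
        # Replay the full history as context for the next LLM call.
        return "\n".join(f"{t['role']}: {t['content']}" for t in self.history)

memory = ConversationMemory()
memory.add_turn("user", "Hello")
memory.add_turn("assistant", "Hi! How can I help?")
```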
Project Structure
MCP_Assistant/
├── app.py
├── main.py              # FastAPI server with chat endpoint
├── streamlit_app.py     # Streamlit frontend UI
├── browser_mcp.json     # MCP server configuration
├── .env                 # Environment variables (API keys)
├── requirements.txt     # Required dependencies
└── README.md            # You are here!
Setup Instructions
1. Clone and Install Dependencies
git clone https://github.com/itsabhishekm/MCP_Assistant.git
cd MCP_Assistant
python -m venv .venv
.venv\Scripts\activate
(On macOS/Linux, use source .venv/bin/activate instead.)
pip install -r requirements.txt
2. Set Environment Variables
In the .env file in the project root, paste your Groq API key:
GROQ_API_KEY=your_groq_api_key
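Once the key is in .env, the backend reads it from the environment at startup; a minimal stdlib sketch of that check (the real app likely loads .env via python-dotenv, which is not shown here):

```python
import os

# Placeholder default so the sketch runs standalone; in practice the
# value comes from the .env file, never hard-coded.
os.environ.setdefault("GROQ_API_KEY", "your_groq_api_key")
api_key = os.environ["GROQ_API_KEY"]
assert api_key, "GROQ_API_KEY must be set before starting the backend"
```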
3. Configure MCP Tools (Optional)
To add other MCP servers, customize browser_mcp.json:
{
  "mcpServers": {
    "google-search": {
      "command": "npx",
      "args": ["-y", "@mcp-server/google-search-mcp@latest"]
    },
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}
Or if you don’t want any MCP server, just leave it empty:
{ "mcpServers": {} }
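A quick way to sanity-check a browser_mcp.json before starting the backend is to parse it and confirm each server entry has a command and args (a hedged sketch; the field names follow the example above):

```python
import json

# Example config embedded as a string; in practice you would read
# the browser_mcp.json file from the project root.
config_text = """
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}
"""
config = json.loads(config_text)
servers = config.get("mcpServers", {})
for name, spec in servers.items():
    # Each MCP server entry needs a launch command and its arguments.
    assert "command" in spec and "args" in spec, f"incomplete entry: {name}"
```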
Running the app
1. Start FastAPI Backend
uvicorn main:app --reload --port 8000
2. Launch Streamlit Frontend
streamlit run streamlit_app.py
Visit: http://localhost:8501
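With both services running, the frontend talks to the backend over HTTP. A minimal stdlib sketch of building such a request (the /chat path and message field are assumptions about the backend's API, not confirmed by the README):

```python
import json
import urllib.request

def build_chat_request(message, base_url="http://localhost:8000"):
    # Build (but do not send) a POST request to the assumed chat endpoint.
    data = json.dumps({"message": message}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/chat",
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello from the frontend")
```

Sending it with `urllib.request.urlopen(req)` would return the backend's JSON reply once the FastAPI server is up.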