Tiny3 AI Agent
What is Tiny3 AI Agent?
tiny3-ai-agent is an AI agent system that integrates various tools, featuring a FastAPI backend and a chat interface. It supports functionalities such as a calculator, web search, and code execution through MCP tools, enhanced by LM Studio integration.
Use cases
Use cases for tiny3-ai-agent include performing mathematical calculations, searching for programming tutorials online, and executing code snippets in real-time, making it versatile for both educational and professional environments.
How to use
To use tiny3-ai-agent, start all services by running ./start_servers.sh. Open the frontend in a browser, and optionally start LM Studio for full AI capabilities. You can test the system with example prompts for the calculator, web search, and code execution tools.
Key features
Key features of tiny3-ai-agent include a modular architecture with FastAPI, integration of multiple MCP tools (calculator, web search, code executor), and support for advanced AI functionalities through LM Studio.
Where to use
tiny3-ai-agent can be used in various fields such as education for learning assistance, software development for code execution and debugging, and general productivity applications for quick calculations and information retrieval.
Tiny3 AI Agent System
A complete AI agent chat interface implementing your original PRD with Hugging Face tiny_agents architecture, FastAPI backend, and MCP tool integration.
🏗️ Architecture
Frontend (HTML/JS) → FastAPI Backend → tiny_agents → LM Studio + MCP Tools
                                                         ↓
                             Calculator, Web Search, Code Executor (HTTP Services)
🚀 Quick Start
1. Start All Services
./start_servers.sh
This will start:
- 🧮 Calculator MCP Tool (port 5001)
- 🔍 Web Search MCP Tool (port 5002)
- 💻 Code Executor MCP Tool (port 5003)
- ⚡ Main FastAPI Backend (port 8000)
2. Open Frontend
# Open in browser
open frontend/index.html
# Or navigate to: file:///path/to/tiny3/frontend/index.html
3. Optional: Start LM Studio
For full AI functionality, start LM Studio on http://localhost:1234
- Download LM Studio from https://lmstudio.ai
- Load a compatible model (Llama 3.2, CodeLlama, etc.)
- Start local server on port 1234
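To sanity-check that LM Studio's OpenAI-compatible endpoint is reachable before wiring it into the agent, a minimal Python snippet can be used (this assumes the standard /v1/chat/completions request shape; the model name below is a placeholder, not something this project defines):

```python
import json
import urllib.request

LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_chat_payload(prompt, model="local-model"):
    """Build an OpenAI-compatible chat completion payload for LM Studio."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask_lm_studio(prompt):
    """POST the prompt to the local LM Studio server and return the reply text."""
    req = urllib.request.Request(
        LM_STUDIO_URL,
        data=json.dumps(build_chat_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

If the call fails with a connection error, LM Studio's local server is not running or no model is loaded.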
4. Test the System
Try these example prompts:
- "What's 15 * 23?" (Calculator tool)
- "Search for Python tutorials" (Web search tool)
- "Run this code: print('Hello World!')" (Code executor tool)
🛠️ Manual Testing
Test Individual Tools
# Calculator
curl -X POST http://localhost:5001/run \
-H "Content-Type: application/json" \
-d '{"tool": "calculator", "args": {"expression": "2+2"}}'
# Web Search
curl -X POST http://localhost:5002/run \
-H "Content-Type: application/json" \
-d '{"tool": "web_search", "args": {"query": "Python programming"}}'
# Code Executor
curl -X POST http://localhost:5003/run \
-H "Content-Type: application/json" \
-d '{"tool": "code_executor", "args": {"code": "print(\"Hello!\")", "language": "python"}}'
Test Backend API
# Health check
curl http://localhost:8000/health
# Chat endpoint
curl -X POST http://localhost:8000/chat \
-H "Content-Type: application/json" \
-d '{"prompt": "What is 5 + 3?", "history": []}'
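The same requests can be issued from Python using only the standard library; this sketch mirrors the curl calls above (the request body shape is taken from those examples):

```python
import json
import urllib.request

def post_json(url, payload):
    """POST a JSON payload and return the decoded JSON response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def tool_request(tool, args):
    """Build the {"tool": ..., "args": ...} body the MCP tool servers expect."""
    return {"tool": tool, "args": args}

# Usage (requires the servers from ./start_servers.sh to be running):
# post_json("http://localhost:5001/run",
#           tool_request("calculator", {"expression": "2+2"}))
# post_json("http://localhost:8000/chat",
#           {"prompt": "What is 5 + 3?", "history": []})
```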
🏃 Stop Services
./stop_servers.sh
📁 Project Structure
tiny3/
├── frontend/
│   ├── index.html            # Chat interface
│   ├── styles.css            # UI styling
│   └── script.js             # Frontend logic
├── backend/
│   ├── main.py               # FastAPI server
│   ├── tiny_agents.py        # Agent implementation
│   ├── requirements.txt      # Python dependencies
│   └── mcp_tools/
│       ├── calculator.py     # Math tool server
│       ├── web_search.py     # Search tool server
│       └── code_executor.py  # Code tool server
├── start_servers.sh          # Start all services
├── stop_servers.sh           # Stop all services
└── README.md                 # This file
🔧 Configuration
Backend Settings
Edit backend/main.py to configure:
- LM Studio endpoint (default: http://localhost:1234/v1)
- Tool server ports
- CORS settings
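Taken together, those settings typically look something like the block below. The variable names here are illustrative, not necessarily the ones used in backend/main.py; the URLs and ports are the defaults documented above.

```python
# Illustrative settings block -- check backend/main.py for the actual names
LM_STUDIO_BASE_URL = "http://localhost:1234/v1"

TOOL_SERVERS = {
    "calculator":    "http://localhost:5001/run",
    "web_search":    "http://localhost:5002/run",
    "code_executor": "http://localhost:5003/run",
}

# CORS: allow the frontend origin (use ["*"] only for local development)
CORS_ALLOW_ORIGINS = ["http://localhost:8000"]
```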
Tool Capabilities
- Calculator: Safe mathematical expressions, functions (sin, cos, sqrt, etc.)
- Web Search: DuckDuckGo API with fallback to mock responses
- Code Executor: Python, JavaScript, Bash with 10-second timeout
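"Safe mathematical expressions" is commonly implemented by walking the parsed AST against a whitelist of operators and functions rather than calling eval(). The actual calculator.py may differ, but a minimal sketch of that approach looks like this:

```python
import ast
import math
import operator

# Whitelisted operators and math functions -- anything else is rejected
_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}
_FUNCS = {"sin": math.sin, "cos": math.cos, "sqrt": math.sqrt}

def safe_eval(expression):
    """Evaluate a math expression without exposing Python's full eval()."""
    def walk(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.operand))
        if (isinstance(node, ast.Call) and isinstance(node.func, ast.Name)
                and node.func.id in _FUNCS):
            return _FUNCS[node.func.id](*[walk(a) for a in node.args])
        raise ValueError("Disallowed expression element")
    return walk(ast.parse(expression, mode="eval").body)
```

Because unknown node types raise immediately, attempts like `__import__('os')` are rejected instead of executed.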
🐛 Troubleshooting
Port Conflicts
# Check what's using ports
lsof -i :5001
lsof -i :5002
lsof -i :5003
lsof -i :8000
# Kill specific processes
./stop_servers.sh
LM Studio Connection
- Ensure LM Studio is running on port 1234
- Check model is loaded and server is started
- Verify OpenAI-compatible API is enabled
Frontend CORS Issues
- Use http://localhost:8000/static/ instead of file://
- Or serve the frontend with a simple HTTP server:
python -m http.server 3000
UI/UX Features
- Enhanced Scrollbar: 16px width with gradient styling and smooth hover effects
- Finite Window Scrolling: Messages scroll within chat area, not entire page
- Container Boundaries: All content stays within the white rounded container
- Cross-Browser Support: Compatible scrollbar styling for WebKit and Firefox browsers
- Responsive Design: Scrollbar adapts to different screen sizes and content lengths
📊 Data Flow Example
- User types "What's 2+2?" in the frontend
- Frontend sends JSON to the /chat endpoint
- tiny_agents processes the prompt + tool registry
- Detects that a calculator tool call is needed
- Sends a request to the calculator MCP server (port 5001)
- Calculator returns {"result": "4"}
- Result is passed back to LM Studio for the final response
- Frontend displays: "The answer is 4" + tool usage info
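In the real system the model decides which tool to call, but the routing step in the flow above can be illustrated with a deliberately simplified pattern-matching dispatcher (this is a sketch for the example prompts, not the tiny_agents implementation):

```python
import re

def route_prompt(prompt):
    """Simplified tool router illustrating the data flow above.
    Returns (tool_name, args); (None, {}) means no tool is needed."""
    # Calculator: a bare arithmetic expression like "2+2" or "15 * 23"
    math_expr = re.search(r"\d+\s*[-+*/]\s*\d+", prompt)
    if math_expr:
        return ("calculator", {"expression": math_expr.group(0)})
    # Web search: "Search for <query>"
    if prompt.lower().startswith("search for"):
        return ("web_search", {"query": prompt[len("search for"):].strip()})
    # Code executor: "Run this code: <code>"
    if "run this code" in prompt.lower():
        code = prompt.split(":", 1)[1].strip()
        return ("code_executor", {"code": code, "language": "python"})
    return (None, {})  # answer with the model alone
```

The returned (tool, args) pair maps directly onto the {"tool": ..., "args": ...} body the MCP tool servers accept.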
🎯 Features Implemented
✅ Complete PRD Implementation
- GUI (Browser Front-End) with HTML/JS
- Tiny Agents Backend (Python Service)
- HTTP Server with FastAPI /chat endpoint
- LM Studio Model Server integration
- MCP Tool Servers (calculator, web_search, code_executor)
- Exact data flow as specified
✅ Additional Features
- Real-time tool status indicators
- Message history management
- Error handling and graceful degradation
- Responsive design with enhanced scrollbar visibility
- Finite window scrolling within chat messages area
- Custom scrollbar styling with 16px width and gradient effects
- Cross-browser scrollbar compatibility (WebKit and Firefox)
- Development/testing utilities
🎉 Ready for production use with LM Studio!