LLM-MCP Travel Orchestrator
Overview
What is the LLM-MCP Travel Orchestrator?
The LLM-MCP Travel Orchestrator is a sophisticated multi-agent travel accommodation system that uses OpenAI's GPT-4o-mini, LangChain, and the Model Context Protocol (MCP) for intelligent property search and recommendations. The system orchestrates multiple AI agents to handle query parsing, filtering, summarization, and real-time accommodation options.
Use cases
The system allows users to conduct property searches based on location, filter accommodations by amenities, specify price ranges, and receive dynamic recommendations. It enables context-aware conversations with personalized suggestions and supports multi-agent collaboration for complex queries.
How to use
To use the system, clone the repository, set up a Python virtual environment, install required dependencies, and configure your OpenAI API key in a .env file. Run the application using Streamlit, and access it through a web browser at the specified local address to interact with the chatbot for accommodation queries.
Key features
Key features include chain-of-thought reasoning through LangChain, real-time property data retrieval via MCP, context-aware conversation management, intelligent response generation, dynamic filtering options, and personalized accommodation recommendations based on user input.
Where to use
This system can be effectively used in travel agencies, online travel platforms, and hospitality services that require efficient accommodation searches and real-time recommendations, enhancing user experiences in the travel planning process.
Content
LLM-MCP Travel Orchestrator
A sophisticated multi-agent travel accommodation system leveraging OpenAI's GPT-4o-mini, LangChain, and the Model Context Protocol (MCP) to provide intelligent property search and recommendations. The system orchestrates multiple AI agents for query parsing, filtering, summarization, and real-time accommodation recommendations.
Technical Architecture
Multi-Agent System
- LLM Orchestration
  - GPT-4o-mini powered natural language understanding
  - Multi-agent collaboration for complex tasks
  - Context-aware conversation management
  - Intelligent response generation
- LangChain Integration
  - Chain-of-thought reasoning
  - Tool-based execution
  - Memory management
  - Response formatting
- MCP Server Integration
  - Real-time property data access
  - Asynchronous communication
  - Robust error handling
  - Efficient data retrieval
Core Components
- LLM Agent Layer
- LangChain Integration Layer
- MCP Integration Layer
- User Interface Layer
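The wiring across these layers can be sketched roughly as follows, assuming the vendored mcp_use package exposes the same MCPClient/MCPAgent interface as the upstream mcp-use library, that airbnb_mcp.json declares the Airbnb MCP server, and that the query string is only illustrative:

```python
import asyncio

from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient


async def main() -> None:
    load_dotenv()  # OPENAI_API_KEY is read from .env

    # MCP Integration Layer: connect to the Airbnb MCP server declared
    # in airbnb_mcp.json (the @openbnb/mcp-server-airbnb package).
    client = MCPClient.from_config_file("airbnb_mcp.json")

    # LLM Agent Layer: GPT-4o-mini handles natural language understanding.
    llm = ChatOpenAI(model="gpt-4o-mini")

    # LangChain Integration Layer: the agent plans tool calls against the
    # MCP server and formats the final response.
    agent = MCPAgent(llm=llm, client=client, max_steps=30)

    result = await agent.run(
        "Find a 2-bedroom apartment in Barcelona with wifi under $150 per night"
    )
    print(result)


asyncio.run(main())
```

In the actual application, chatbot.py presumably wraps an equivalent flow behind the Streamlit chat interface.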
Getting Started
Prerequisites
- Python 3.11 or higher
- Node.js and npm
- OpenAI API key (Get one here)
Installation
- Clone the Repository
git clone https://github.com/ANUVIK2401/LLM-MCP-Travel-Orchestrator.git
cd LLM-MCP-Travel-Orchestrator
- Set Up Virtual Environment
python -m venv venv
# Activate virtual environment
# On macOS/Linux:
source venv/bin/activate
# On Windows:
.\venv\Scripts\activate
- Install Dependencies
pip install -r requirements.txt
npm install -g @openbnb/mcp-server-airbnb
- Configure Environment
Create a .env file in the project root and add your OpenAI API key:
OPENAI_API_KEY=your_api_key_here
Important: Never commit your .env file or share your API key. The .env file is already in .gitignore for security.
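For reference, this is roughly how the key is typically picked up at startup with python-dotenv (the exact code in chatbot.py may differ):

```python
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the project root
api_key = os.getenv("OPENAI_API_KEY")
if not api_key:
    raise RuntimeError("OPENAI_API_KEY is not set; add it to your .env file")
```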
Running the Application
streamlit run chatbot.py
Then open your browser and navigate to: http://localhost:8501
Usage Guide
- Property search by location
- Amenity-based filtering
- Price range specifications
- Location-based recommendations
- Multi-agent collaboration
- Context-aware conversations
- Dynamic filtering options
- Personalized recommendations
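As a purely illustrative sketch (these prompts do not come from the repository), the kinds of requests the chatbot targets look like:

```python
# Hypothetical prompts covering location search, amenity/price filtering,
# and a context-aware follow-up; illustrative only, not taken from the repo.
example_prompts = [
    "Find apartments in Lisbon with wifi and a kitchen under $120 per night",
    "Only show places that allow pets and have a washer",
    "Which of those is closest to the waterfront?",
]
```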
Screenshots
Screenshots in the repository illustrate the main chatbot interface, property search results, multi-agent collaboration, and real-time recommendations.
Project Structure
LLM-MCP-Travel-Orchestrator/
├── assets/
│   └── images/
├── chatbot.py
├── airbnb_use.py
├── airbnb_mcp.json
├── requirements.txt
├── pyproject.toml
├── pytest.ini
├── LICENSE
├── .gitignore
├── docs/
├── mcp_use/
│   ├── agents/
│   ├── connectors/
│   ├── task_managers/
│   ├── client.py
│   ├── config.py
│   ├── logging.py
│   ├── session.py
│   └── __init__.py
├── tests/
│   ├── conftest.py
│   └── unit/
│       ├── test_client.py
│       ├── test_config.py
│       ├── test_http_connector.py
│       ├── test_logging.py
│       ├── test_session.py
│       └── test_stdio_connector.py
└── venv/
Documentation
- See the docs/ directory for detailed guides, a quickstart, and an API reference.
- Examples: docs/introduction.mdx, docs/quickstart.mdx
Development
- Fork the repository
- Create a feature branch
- Set up your development environment
- Make your changes
- Test thoroughly (see the tests/ directory)
- Submit a pull request
Key Dependencies
- streamlit==1.32.0
- python-dotenv==1.0.0
- mcp-use==1.1.5
- langchain-openai>=0.0.5
- langchain-community>=0.0.34
- langchain>=0.1.16
Security Considerations
- Keep your API keys secure
- Never commit sensitive information
- Use environment variables
- Update dependencies regularly
- Follow security best practices
Contributing
- Fork the repository
- Create your feature branch (git checkout -b feature/AmazingFeature)
- Commit your changes (git commit -m 'Add some AmazingFeature')
- Push to the branch (git push origin feature/AmazingFeature)
- Open a Pull Request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
- openbnb-org/mcp-server-airbnb for the MCP server
- OpenAI for the GPT models
- LangChain for the agent framework
- Streamlit for the web framework
Support
For support:
- Check the Issues page
- Create a new issue if your problem isn't already listed
- Contact the maintainers for urgent issues
Made with ❤️ by Anuvik Thota