news_search_mcp
What is news_search_mcp
news_search_mcp is an AI News Podcast Generator that utilizes a Flask-based server to create podcast scripts focused on AI news, interfacing with local Ollama models through a React application.
Use cases
Use cases for news_search_mcp include generating daily AI news podcasts, creating educational content about AI advancements, and providing updates for tech enthusiasts through audio formats.
How to use
To use news_search_mcp, first set up the MCP AI News Podcast Server by installing the required Python packages and running the server. Then, start the Ollama News Interface to interact with the server via a user-friendly React application.
Key features
Key features of news_search_mcp include podcast script generation using AI, integration with local Ollama models, a Flask-based backend for handling requests, and a React frontend for user interaction.
Where to use
news_search_mcp can be used in fields such as media, education, and technology, where there is a need for automated content generation and dissemination of AI-related news.
Content
AI News Podcast Generator with Ollama Interface
This project consists of two main components:
- MCP AI News Podcast Server: A Flask-based server that generates podcast scripts about AI news using local Ollama models.
- Ollama News Interface: A React application that provides a user interface to interact with both local Ollama models and the MCP AI News Podcast Server.
Project Structure
```
/news_search_mcp
├── mcp_ai_news_podcast_server/   # Flask server for podcast generation
│   ├── src/                      # Server source code
│   │   ├── main.py               # Flask app entry point
│   │   ├── news_fetcher.py       # News fetching module (conceptual)
│   │   ├── ollama_interactor.py  # Ollama API interaction
│   │   └── script_generator.py   # Podcast script generation
│   ├── requirements.txt          # Python dependencies
│   └── README.md                 # Server documentation
├── ollama-news-interface/        # React frontend
│   ├── src/                      # React source code
│   │   ├── App.js                # Main React component
│   │   ├── App.css               # Styling
│   │   └── setupProxy.js         # Proxy configuration
│   ├── package.json              # Node.js dependencies
│   └── README.md                 # Frontend documentation
├── mcp_server_architecture.md    # System architecture document
└── README.md                     # This file
```
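To make the role of the script generation module concrete, here is a minimal sketch of what `script_generator.py` might do; the function name and script layout are hypothetical, assuming the generator turns a list of headlines into an intro/segments/outro script:

```python
# Hypothetical sketch of script_generator.py: turn a list of
# news headlines into a structured podcast script.
def generate_podcast_script(headlines):
    """Assemble a simple intro/segments/outro podcast script."""
    lines = ["[Intro] Welcome to today's AI news roundup."]
    for i, headline in enumerate(headlines, start=1):
        lines.append(f"[Segment {i}] {headline}")
    lines.append("[Outro] That's all for today. Thanks for listening!")
    return "\n".join(lines)

script = generate_podcast_script([
    "New open-weight model released",
    "AI policy update announced",
])
print(script)
```

In the real server, the headline list would come from the news fetching module, and an Ollama model could expand each segment into full narration.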
Prerequisites
- Python 3.10+ for the Flask server
- Node.js and npm for the React application
- Ollama installed and running locally
- curl for API requests
Setup and Running
1. Start the MCP AI News Podcast Server
```bash
cd mcp_ai_news_podcast_server
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
python src/main.py
```
The server will start on http://127.0.0.1:5000
2. Start the Ollama News Interface
```bash
cd ollama-news-interface
npm install
npm start
```
The React application will start on http://localhost:3000
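Under the hood, `ollama_interactor.py` presumably talks to Ollama's local REST API. As a rough sketch (the helper name is hypothetical; the endpoint and payload shape follow Ollama's standard `/api/generate` interface on its default port), such a request could be built with the standard library:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local port

def build_generate_request(model, prompt):
    """Build an HTTP request for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending the request requires a running Ollama instance, e.g.:
# with urllib.request.urlopen(build_generate_request("llama3", "Hello")) as resp:
#     print(json.loads(resp.read())["response"])
```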
How to Use
- Open the React application in your browser at http://localhost:3000
- Select an Ollama model from the dropdown list
- Enter your query in the input text box:
  - For general queries, the application will send them directly to the selected Ollama model
  - For AI news-related queries (containing keywords like “news”, “AI news”, etc.), the application will send the request to the MCP AI News Podcast Server
- Click “Submit” to process your query
- The response will appear in the output box below
Features
- Model Selection: Choose from locally installed Ollama models
- Automatic Query Routing: Detects AI news queries and routes them to the podcast server
- Podcast Script Generation: Creates structured podcast scripts about AI news
- Error Handling: Provides clear error messages for troubleshooting
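The automatic query routing described above amounts to a keyword check. A minimal sketch (the function name and keyword list here are illustrative, not the app's actual implementation, which lives in the React frontend):

```python
# Illustrative keyword-based router: decide whether a query goes to
# the podcast server or directly to the selected Ollama model.
NEWS_KEYWORDS = ("news", "ai news", "headlines", "podcast")

def route_query(query):
    """Return 'podcast_server' for news-related queries, 'ollama' otherwise."""
    text = query.lower()
    if any(keyword in text for keyword in NEWS_KEYWORDS):
        return "podcast_server"
    return "ollama"

print(route_query("Give me today's AI news"))  # podcast_server
print(route_query("Explain transformers"))     # ollama
```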
Notes
- The news fetching functionality is conceptual and uses mock data for testing
- In a real scenario, an AI agent would handle the actual news fetching using web search tools
- The podcast server requires Ollama to be installed and running locally
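Since news fetching is conceptual, a mock fetcher along these lines (names and data are hypothetical, illustrating the mock-data approach the notes describe) could stand in for real web search:

```python
def fetch_ai_news():
    """Return mock AI news items; a real implementation would use
    web search tools instead of this hard-coded list."""
    return [
        {"title": "Researchers release a new open-source language model",
         "summary": "The model targets on-device inference."},
        {"title": "Major cloud provider announces AI safety initiative",
         "summary": "The program funds interpretability research."},
    ]

for item in fetch_ai_news():
    print(f"{item['title']} - {item['summary']}")
```

Swapping this stub for a real fetcher would not change the rest of the pipeline, as long as it returns items with the same `title`/`summary` shape.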