Finance Assistant with MCP and Langchain
What is Finance Assistant with MCP and Langchain?
Finance-Assistant-with-MCP-and-Langchain is a conversational finance assistant that allows users to interact using natural language to obtain real-time stock quotes, market news, and insights on market movers.
Use cases
Use cases include asking for current stock prices, recent news about specific companies, identifying top market gainers, and obtaining general market news.
How to use
Users can access the assistant through a web interface built with Streamlit. Simply type questions related to stock prices, market news, or specific companies to receive instant responses.
Key features
Key features include natural language queries, real-time data retrieval from financial APIs, secure API key management, conversational responses generated by the LLM, follow-up suggestions, and a modular architecture that separates UI from backend logic.
Where to use
This assistant is suitable for use in finance-related fields, including personal finance management, investment analysis, and stock market research.
Conversational Finance Assistant (FastMCP + Langchain + Streamlit)
This project demonstrates building a conversational financial assistant capable of retrieving real-time stock quotes, news headlines, and market mover data using natural language queries.
It features a decoupled architecture:
- Backend: A secure server built with FastMCP acting as a gateway to financial APIs (Finnhub, Alpha Vantage).
- Frontend: A user-friendly web interface built with Streamlit.
- Agent: Powered by Langchain and the OpenAI API (GPT models) to understand user requests, utilize backend tools, and generate conversational responses.
Technical Report: Report
Features
- Natural Language Queries: Ask questions like:
- “What’s the price of Apple?”
- “How is MSFT doing today?”
- “Any recent news for TSLA?”
- “Show me the top gainers today.”
- “What’s the market news?”
- Real-time Data: Fetches current stock quotes, recent news, and market movers via external APIs.
- Secure API Key Management: Financial API keys are stored securely on the backend MCP server, not exposed in the frontend or to the LLM.
- Conversational Responses: The LLM synthesizes data fetched via tools into easy-to-understand answers.
- Follow-up Suggestions: Provides relevant next questions to continue the conversation.
- Modular Architecture: Decouples the UI/Agent logic from the backend data fetching logic using the Model Context Protocol (MCP).
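The secure key-management feature above can be illustrated with a stdlib-only sketch (the helper name `get_quote` and the injected `fetch` parameter are hypothetical, added for testability): the backend reads the key from its own environment and returns only the fetched payload, so neither the frontend nor the LLM ever sees the key.

```python
import os

def get_quote(symbol: str, fetch=None) -> dict:
    """Fetch a quote using a server-side API key.

    The key is read from the server's environment; only the quote
    payload is returned to the caller, never the key itself.
    """
    api_key = os.environ.get("FINNHUB_API_KEY", "")
    if not api_key:
        raise RuntimeError("FINNHUB_API_KEY is not set on the server")
    # Finnhub's quote endpoint takes the key as a `token` query parameter.
    url = f"https://finnhub.io/api/v1/quote?symbol={symbol}&token={api_key}"
    # `fetch` is injected here for testability; the real server would
    # perform an HTTP GET (e.g. via httpx) instead.
    raw = fetch(url) if fetch else {}
    # Return only the data -- the key never leaves this function.
    return {"symbol": symbol, "price": raw.get("c")}
```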
Architecture
The system uses a client-server architecture orchestrated by a Langchain agent:
- User Interface (Streamlit): Handles chat display, user input, and suggestion buttons.
- Langchain Agent Executor: Resides in the Streamlit app. Uses `ChatOpenAI` and defined `StructuredTool`s. Manages the conversation flow, calls the LLM, and executes tools when requested.
- OpenAI LLM: Interprets user intent, decides when to call tools, and synthesizes final responses from tool results.
- Langchain Tools (in UI): Python functions (`get_price`, `get_news`, `get_market_movers`) defined within the UI code. These tools are invoked by the Agent Executor.
- FastMCP Client (in UI Tools): The Langchain tools use `fastmcp.Client` to communicate with the backend MCP server.
- FastMCP Server (Backend): A separate Python process (`fin_server_v2.py`). Exposes financial data-fetching capabilities as secure MCP Tools (`@mcp.tool()`) and Resources (`@mcp.resource()`) and handles interaction with external financial APIs.
- Financial APIs: Finnhub and Alpha Vantage (can be extended).
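The flow above (LLM picks a tool, the executor runs it, the result is folded into the answer) can be mocked with a stdlib-only sketch. This is illustrative, not the project's code: the real app delegates the decision and dispatch to Langchain's AgentExecutor and the OpenAI LLM, and the stub data below is made up.

```python
# A registry of callable tools, as the Agent Executor would hold them.
# The stub returns canned data; the real tool calls the MCP server.
TOOLS = {
    "get_price": lambda symbol: {"symbol": symbol, "price": 189.5},
}

def decide_tool(query: str):
    """Stand-in for the LLM's tool-selection step."""
    if "price" in query.lower():
        return "get_price", {"symbol": "AAPL"}
    return None, {}

def run_agent(query: str) -> str:
    """Minimal dispatch loop: decide, execute, phrase the answer."""
    name, args = decide_tool(query)
    if name is None:
        return "Sorry, I can only look up prices in this sketch."
    result = TOOLS[name](**args)
    # The real agent would hand `result` back to the LLM to phrase
    # a conversational answer; here we format it directly.
    return f"{result['symbol']} is trading at {result['price']}."
```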
Setup and Installation
Prerequisites:
- Python 3.10+
- `uv` (recommended) or `pip`
- API Keys:
  - OpenAI API Key (platform.openai.com/account/api-keys)
  - Finnhub API Key (finnhub.io)
  - Alpha Vantage API Key (alphavantage.co)
Steps:
1. Clone the Repository:

   ```bash
   git clone <your-repo-url>
   cd <your-repo-name>
   ```

2. Create `.env` File:

   Create a file named `.env` in the project root and add your API keys:

   ```
   # .env file
   FINNHUB_API_KEY=YOUR_FINNHUB_KEY
   ALPHA_VANTAGE_API_KEY=YOUR_ALPHA_VANTAGE_KEY
   OPENAI_API_KEY=sk-YOUR_OPENAI_KEY
   ```

   (Replace the placeholder values with your actual keys.)

3. Create Virtual Environment:

   ```bash
   uv venv                       # Creates a .venv folder
   source .venv/bin/activate     # On Linux/macOS
   # .\.venv\Scripts\activate    # On Windows CMD/PowerShell
   ```

4. Install Dependencies:

   ```bash
   uv pip install -r requirements.txt
   # OR if you don't have a requirements.txt yet:
   # uv pip install streamlit "fastmcp" httpx python-dotenv pydantic-settings openai langchain langchain-openai pydantic langchainhub "langchain-community"
   ```

   (See `requirements.txt` for specific tested versions.)
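The `.env` file created above is read at startup by python-dotenv. A stdlib-only sketch of the parsing it performs (simplified: the real `load_dotenv` also handles quoting, `export` prefixes, and variable expansion):

```python
def load_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and comments.

    A minimal subset of what python-dotenv's load_dotenv supports.
    """
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # ignore blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env
```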
Running the Application
You need to run the backend MCP server and the frontend Streamlit UI separately.
1. Run the Backend MCP Server:

   Open a terminal, activate the virtual environment, and run:

   ```bash
   python fin_server_v2.py
   ```

   Keep this terminal window open. You should see log messages indicating it started successfully and loaded API keys.

2. Run the Frontend Streamlit UI:

   Open a second terminal window, activate the same virtual environment, and run:

   ```bash
   streamlit run fin_langchain_v2.py
   ```

   Streamlit will provide a local URL (usually http://localhost:8501). Open this URL in your web browser.

3. Interact: Start asking financial questions in the chat interface!
Code Structure
- `fin_server_v2.py`: The backend FastMCP server application. Contains tool and resource definitions and interacts with financial APIs.
- `fin_langchain_v2.py`: The frontend Streamlit application. Contains the Langchain agent setup, UI components, and helper functions to call the MCP server.
- `.env` (you create this): Stores API keys securely.
- `requirements.txt` (you create this or use the one provided): Lists Python dependencies.
Future Improvements
- Add more financial tools (historical data, fundamentals, analyst ratings).
- Implement more sophisticated error handling and API fallback logic.
- Improve NLU for ticker/company name recognition.
- Integrate Langchain memory more deeply for multi-turn context.
- Add data visualization (charts) to the Streamlit UI.
- Implement server-side caching for financial APIs.
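The server-side caching idea above can be sketched with a small stdlib-only TTL decorator (the names and the 60-second window are illustrative, not from the project): repeated queries for the same symbol inside the window skip the external API entirely.

```python
import time

def ttl_cache(ttl_seconds: float):
    """Cache a single-argument fetcher's results for `ttl_seconds`."""
    def decorator(fetch):
        store = {}  # symbol -> (timestamp, cached value)
        def wrapper(symbol, _now=time.monotonic):
            now = _now()
            hit = store.get(symbol)
            if hit and now - hit[0] < ttl_seconds:
                return hit[1]  # fresh enough: skip the external API
            value = fetch(symbol)
            store[symbol] = (now, value)
            return value
        return wrapper
    return decorator
```

A production version would also cap the cache size and evict stale entries, but this shows the shape of the optimization.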
Contributing
Contributions are welcome! Please feel free to submit a Pull Request or open an Issue.
License
This project is licensed under the MIT License.