Pa Agent
What is Pa Agent
pa_agent is a modular personal assistant built with LangGraph, designed to facilitate conversational interactions and retrieval-augmented generation (RAG) over various document types. It integrates live data through an MCP server with over 25 endpoints and supports both long-term and short-term memory using PostgreSQL and Redis.
Use cases
Use cases for pa_agent include providing instant customer support through conversational interfaces, summarizing documents, retrieving real-time financial data, and assisting users in managing their projects and tasks.
How to use
To use pa_agent, clone the repository from GitHub, set up the necessary prerequisites (Python, Docker, Redis, and PostgreSQL), and run the CLI entry point at app/run.py. You can also access it programmatically via the Python SDK.
Key features
Key features of pa_agent include conversational chat capabilities, retrieval-augmented generation for various document formats, a robust MCP server with multiple endpoints, long-term memory storage in PostgreSQL, short-term memory for recent conversations, and various tool integrations for web searches, file handling, and financial data retrieval.
Where to use
pa_agent can be used in various fields such as customer support, personal productivity, data analysis, and financial services, where conversational AI and data retrieval are beneficial.
Personal Assistant Agent with LangGraph
A powerful conversational agent with retrieval-augmented generation, MCP server, long- and short-term memory, and tool integrations—powered by LangGraph, LangChain, Redis, PostgreSQL, and Pinecone.
Features
- Conversational Chat
  - Interactive CLI (`app/run.py`)
  - Programmatic access via Python SDK (`langgraph_sdk`)
- Retrieval-Augmented Generation
  - Ingest & query PDF, Markdown, HTML, CSV & DOCX into Pinecone
  - `index_docs(name, path_or_url)` & `query_index(name, question, k)` tools (see the sketch after this list)
- MCP Server
  - 25+ CoinMarketCap endpoints exposed
- Long-Term Memory
  - Profile, projects & instructions stored in PostgreSQL, namespaced by user
- Short-Term Memory
  - Rolling summary of recent conversation (pruned after 10+ turns)
- Tool Integrations
  - Web & Knowledge: `web_fetch`, `wiki_search`, `tavily_search`
  - File Handling: `inspect_file`, `summarise_file`, `extract_tables`, `ocr_image`, `save_uploaded_file`
  - Finance: `get_stock_quote`, `get_stock_news`
- Robustness
  - Automatic retries on transient OpenAI errors
  - Healthchecks on Redis & Postgres in Docker Compose
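For a feel of the RAG tools, here is how `index_docs` and `query_index` could be driven directly from Python. The signatures come from the feature list above; the import path and file name are assumptions about the repo layout, so treat this as a sketch rather than working code.

```python
# Hypothetical direct use of the RAG tools. Signatures match the feature
# list; the import path and document name are assumptions.
from app.tools import index_docs, query_index

# Ingest a document (PDF, Markdown, HTML, CSV or DOCX) into a Pinecone index.
index_docs("handbook", "docs/employee_handbook.pdf")

# Ask a question against that index, retrieving the top 3 matching chunks.
answer = query_index("handbook", "What is the vacation policy?", k=3)
print(answer)
```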
Project Structure
```
personal_assistant/
├── app/
│   ├── config.py
│   ├── run.py              # CLI entrypoint
│   ├── graph/
│   │   ├── assistant.py    # StateGraph definition
│   │   ├── state.py        # ChatState schema
│   │   └── memory/         # summarization, schemas
│   ├── tools/              # external tool fns
│   ├── rag/                # RAG loaders & indexers
│   ├── mcp/                # MCP server logic
│   └── schemas/
├── requirements.txt
├── docker-compose.yml
├── langgraph.json
└── README.md
```
Prerequisites
- Python 3.11+
- Docker & Docker Compose (for containerized deployment)
- Redis & PostgreSQL (local or via Docker)
- OpenAI, Pinecone, Tavily & CoinMarketCap API keys
Getting Started
Local Setup
1. Clone the repo:

   ```bash
   git clone https://github.com/amanzoni1/pa_agent && cd pa_agent
   ```

2. Create and activate a virtual environment:

   ```bash
   python -m venv .venv
   source .venv/bin/activate
   ```

3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

4. Create a `.env` file in the project root (see the loading sketch after these steps):

   ```
   OPENAI_API_KEY=your_openai_key
   LANGSMITH_API_KEY=your_langsmith_key
   REDIS_URI=redis://localhost:6379
   POSTGRES_URI=postgresql://postgres:postgres@localhost:5432/personal_assistant?sslmode=disable
   PINECONE_API_KEY=your_pinecone_key
   TAVILY_API_KEY=your_tavily_key
   COINMARKETCAP_API_KEY=your_cmc_key
   ```

5. Ensure Redis and Postgres are running locally.
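`app/config.py` presumably reads these variables at startup; a minimal sketch of that common python-dotenv pattern is below. Only the variable names come from the `.env` above; the loading code itself is an assumption, not the repo's actual implementation.

```python
# Minimal config-loading sketch (assumed pattern, not the repo's actual code).
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the project root

OPENAI_API_KEY = os.environ["OPENAI_API_KEY"]                 # required
REDIS_URI = os.getenv("REDIS_URI", "redis://localhost:6379")  # optional default
POSTGRES_URI = os.environ["POSTGRES_URI"]
PINECONE_API_KEY = os.environ["PINECONE_API_KEY"]
```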
Command-Line Interface (CLI)
Interact via chat in the terminal:

```bash
python -m app.run --thread-id <optional-uuid> --user-id <optional-your_id>
```

Commands:

- `/memory`: show the long-term memory (profile, projects, instructions) stored for your user.
- `/mcp`: list all the tools available from the MCP server (a sketch of one such tool follows below).
- `/exit` or Ctrl-D: quit.
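To give a feel for what sits behind `/mcp`, a CoinMarketCap endpoint can be exposed as an MCP tool roughly as sketched below. This uses FastMCP from the official `mcp` Python SDK; the tool name and request details are illustrative assumptions, not the actual code in `app/mcp/`.

```python
# Illustrative MCP tool in the style of app/mcp/ (not the repo's actual code).
import os

import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("coinmarketcap")

@mcp.tool()
def get_crypto_quote(symbol: str) -> dict:
    """Return the latest CoinMarketCap quote for a symbol, e.g. 'BTC'."""
    resp = httpx.get(
        "https://pro-api.coinmarketcap.com/v1/cryptocurrency/quotes/latest",
        params={"symbol": symbol},
        headers={"X-CMC_PRO_API_KEY": os.environ["COINMARKETCAP_API_KEY"]},
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```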
Docker Compose Deployment
1. Build your LangGraph image:

   ```bash
   langgraph build -t my-assistant
   ```

2. Launch via Docker Compose:

   ```bash
   docker compose up -d
   ```

3. Access:

   - API: http://localhost:8123
   - Swagger Docs: http://localhost:8123/docs
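Once the stack is up, you can smoke-test the server from Python. This assumes the standard LangGraph server route for creating threads; confirm the exact paths in the Swagger docs.

```python
# Smoke test against the running LangGraph server. POST /threads is assumed
# to be available (standard LangGraph API route); check /docs to confirm.
import httpx

resp = httpx.post("http://localhost:8123/threads", json={})
resp.raise_for_status()
print("created thread:", resp.json()["thread_id"])
```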
Python SDK Usage
Install the SDK:

```bash
pip install langgraph_sdk
```
Example:
```python
import asyncio

from langgraph_sdk import get_client
from langchain_core.messages import HumanMessage


async def main():
    # Connect to the LangGraph server started via Docker Compose.
    client = get_client(url="http://localhost:8123")

    # Create a fresh conversation thread.
    thread = await client.threads.create()

    # Kick off a run of the "my-assistant" graph on that thread.
    run = await client.runs.create(
        thread["thread_id"],
        "my-assistant",
        input={"messages": [HumanMessage(content="Hello!")]},
        config={"configurable": {"user_id": "you", "thread_id": thread["thread_id"]}},
    )

    # Block until the run finishes, then pull the AI reply out of the state.
    final = await client.runs.join(thread["thread_id"], run["run_id"])
    msgs = final.get("messages", [])
    ai = msgs[1]["content"] if len(msgs) > 1 else None
    print("AI:", ai)


asyncio.run(main())
```
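The SDK can also stream events as the run progresses. A sketch of the same call using `client.runs.stream` (part of `langgraph_sdk`; the event payload shape depends on `stream_mode`, so treat this as a starting point):

```python
# Streaming variant of the example above; payloads vary with stream_mode.
import asyncio

from langgraph_sdk import get_client


async def stream_demo():
    client = get_client(url="http://localhost:8123")
    thread = await client.threads.create()
    async for part in client.runs.stream(
        thread["thread_id"],
        "my-assistant",
        input={"messages": [{"role": "user", "content": "Hello!"}]},
        config={"configurable": {"user_id": "you", "thread_id": thread["thread_id"]}},
        stream_mode="values",
    ):
        print(part.event, part.data)


asyncio.run(stream_demo())
```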