crewai-mcp-neo4j-fastapi
What is crewai-mcp-neo4j-fastapi?
crewai-mcp-neo4j-fastapi is a FastAPI server that integrates CrewAI with Neo4j to process queries related to data stored in a Neo4j graph database.
Use cases
Use cases include querying complex relationships in social networks, analyzing data in recommendation systems, and managing knowledge graphs for enhanced data retrieval.
How to use
To use crewai-mcp-neo4j-fastapi, set up the environment by copying the sample.env file to .env and filling in the necessary values. Install dependencies with 'poetry install' and start the server with 'poetry run uvicorn main:app --reload --port 4000'. The interactive documentation is then available at http://127.0.0.1:4000/docs.
Key features
Key features include integration with Neo4j for graph database queries, support for LLM inference via OpenAI, and easy setup and management using Poetry.
Where to use
crewai-mcp-neo4j-fastapi can be used in various fields such as data analytics, artificial intelligence applications, and any domain requiring graph database management and querying.
Content
FastAPI server using CrewAI and Neo4j MCP
This is a simple FastAPI server that uses CrewAI and Neo4j MCP to process queries about the data within a Neo4j graph database.
Requirements
- Poetry for virtual env and dependency management
- OpenAI Key for LLM inference
- Running Neo4j database
Setup
- Copy the sample.env file to .env and fill in the values.
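A filled-in .env might look like the fragment below. The variable names are assumptions inferred from the stated requirements (an OpenAI key and a running Neo4j database); check sample.env for the names the project actually expects.

```shell
# Hypothetical .env contents -- variable names are illustrative assumptions
OPENAI_API_KEY=sk-...            # your OpenAI API key for LLM inference
NEO4J_URI=bolt://localhost:7687  # connection URI of the running Neo4j instance
NEO4J_USERNAME=neo4j             # Neo4j login user
NEO4J_PASSWORD=your-password     # Neo4j login password
```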
- Run:
poetry install
Start FastAPI Server
poetry run uvicorn main:app --reload --port 4000
Interactive docs will be accessible at:
http://127.0.0.1:4000/docs
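Once the server is up, a client can send questions over plain HTTP. The stdlib sketch below builds such a request; the /query path and the {"query": ...} payload shape are assumptions for illustration, so adapt them to the routes shown in the interactive docs.

```python
import json
import urllib.request


def build_query_request(question: str,
                        base_url: str = "http://127.0.0.1:4000") -> urllib.request.Request:
    """Build a POST request for the server's (assumed) /query endpoint."""
    payload = json.dumps({"query": question}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/query",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Sending the request (requires the server to be running):
# with urllib.request.urlopen(build_query_request("Which users follow each other?")) as resp:
#     print(json.load(resp))
req = build_query_request("Which users follow each other?")
print(req.full_url)  # http://127.0.0.1:4000/query
```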