Qdrant Loader
What is Qdrant Loader
qdrant-loader is an enterprise-ready toolkit designed for building searchable knowledge bases from various data sources, integrating seamlessly with the Qdrant vector database and supporting multi-project management.
Use cases
Use cases include creating searchable knowledge bases from technical documentation, integrating AI assistants in development environments, and enhancing search capabilities in applications like Confluence and JIRA.
How to use
To use qdrant-loader, install the necessary packages and configure data source connectors for Git, Confluence, JIRA, and local files. Utilize its features for file conversion, intelligent document processing, and vector embedding to load data into the Qdrant database.
Key features
Key features include support for multiple data sources, automatic file conversion, AI-powered image descriptions, incremental updates, intelligent document processing, and performance monitoring. The MCP server enhances LLM integration with advanced search capabilities.
Where to use
qdrant-loader is suitable for industries requiring efficient knowledge management, such as software development, project management, and any domain that involves handling large volumes of technical documents and data.
QDrant Loader
📋 Release Notes v0.4.10 - Latest improvements and bug fixes (June 18, 2025)
A comprehensive toolkit for loading data into Qdrant vector database with advanced MCP server support for AI-powered development workflows.
🎯 What is QDrant Loader?
QDrant Loader is a powerful data ingestion and retrieval system that bridges the gap between your technical content and AI development tools. It collects, processes, and vectorizes content from multiple sources, then provides intelligent search capabilities through a Model Context Protocol (MCP) server.
Perfect for:
- 🤖 AI-powered development with Cursor, Windsurf, and GitHub Copilot
- 📚 Knowledge base creation from scattered documentation
- 🔍 Intelligent code assistance with contextual documentation
- 🏢 Enterprise content integration from Confluence, JIRA, and Git repositories
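The retrieval half of this workflow boils down to nearest-neighbor search over embedding vectors. A toy sketch with hand-made 3-d vectors (hypothetical values and chunk names; the real pipeline uses OpenAI or local embedding models and QDrant's vector index):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical 3-d "embeddings" for three document chunks and one query.
docs = {
    "auth guide": [0.9, 0.1, 0.0],
    "deploy notes": [0.1, 0.9, 0.1],
    "api errors": [0.7, 0.2, 0.3],
}
query = [0.8, 0.15, 0.1]

# Semantic search = return the chunk whose vector is most aligned with the query.
best = max(docs, key=lambda name: cosine_similarity(docs[name], query))
print(best)  # -> auth guide
```

QDrant does the same comparison at scale with an approximate-nearest-neighbor index instead of a linear scan.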
📦 Packages
This monorepo contains two complementary packages:
qdrant-loader — data ingestion and processing engine
Collects and vectorizes content from multiple sources into the QDrant vector database.
Key Features:
- Multi-source connectors: Git, Confluence (Cloud & Data Center), JIRA (Cloud & Data Center), Public Docs, Local Files
- Advanced file conversion: 20+ file types including PDF, Office docs, images with AI-powered processing
- Intelligent chunking: Smart document processing with metadata extraction
- Incremental updates: Change detection and efficient synchronization
- Flexible embeddings: OpenAI, local models, and custom endpoints
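The multi-source connectors above are wired up in the workspace config file. A sketch of what a two-source setup might look like (the `git` entry mirrors the snippet used later in the Quick Start; the `localfile` entry and its `path` key are hypothetical — check `config.template.yaml` for the authoritative schema):

```yaml
# Illustrative only -- the shipped config.template.yaml is the source of truth.
sources:
  git:
    - url: "https://github.com/your-org/your-repo.git"
      branch: "main"
  localfile:
    - path: "./docs"   # hypothetical local-files connector entry
```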
qdrant-loader-mcp-server — AI development integration layer
A Model Context Protocol server providing RAG capabilities to AI development tools.
Key Features:
- MCP protocol compliance: Full integration with Cursor, Windsurf, and Claude Desktop
- Advanced search tools: Semantic, hierarchy-aware, and attachment-focused search
- Confluence intelligence: Deep understanding of page hierarchies and relationships
- File attachment support: Comprehensive attachment discovery with parent document context
- Real-time processing: Streaming responses for large result sets
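Over MCP, a client such as Cursor invokes these search tools with a standard JSON-RPC `tools/call` request. The tool name and argument shape below are illustrative assumptions, not the server's documented API:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": { "query": "authentication in our API", "limit": 5 }
  }
}
```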
🚀 Quick Start
Installation
# Install both packages
pip install qdrant-loader qdrant-loader-mcp-server
# Or install individually
pip install qdrant-loader # Data ingestion only
pip install qdrant-loader-mcp-server # MCP server only
5-Minute Setup

1. Create a workspace

   mkdir my-qdrant-workspace && cd my-qdrant-workspace

2. Download the configuration templates

   curl -o config.yaml https://raw.githubusercontent.com/martin-papy/qdrant-loader/main/packages/qdrant-loader/conf/config.template.yaml
   curl -o .env https://raw.githubusercontent.com/martin-papy/qdrant-loader/main/packages/qdrant-loader/conf/.env.template

3. Configure your environment (edit .env)

   QDRANT_URL=http://localhost:6333
   QDRANT_COLLECTION_NAME=my_docs
   OPENAI_API_KEY=your_openai_key

4. Configure data sources (edit config.yaml)

   sources:
     git:
       - url: "https://github.com/your-org/your-repo.git"
         branch: "main"

5. Load your data

   qdrant-loader --workspace . init
   qdrant-loader --workspace . ingest

6. Start the MCP server

   mcp-qdrant-loader
🎉 You’re ready! Your content is now searchable through AI development tools.
🔧 Integration Examples
Cursor IDE Integration
Add to .cursor/mcp.json:
{
"mcpServers": {
"qdrant-loader": {
"command": "/path/to/venv/bin/mcp-qdrant-loader",
"env": {
"QDRANT_URL": "http://localhost:6333",
"QDRANT_COLLECTION_NAME": "my_docs",
"OPENAI_API_KEY": "your_key",
"MCP_DISABLE_CONSOLE_LOGGING": "true"
}
}
}
}
Example Queries in Cursor
- “Find documentation about authentication in our API”
- “Show me examples of error handling patterns”
- “What are the deployment requirements for this service?”
- “Find all attachments related to database schema”
📁 Project Structure
qdrant-loader/
├── packages/
│   ├── qdrant-loader/             # Core data ingestion package
│   └── qdrant-loader-mcp-server/  # MCP server for AI integration
├── docs/                          # Comprehensive documentation
├── website/                       # Documentation website generator
└── README.md                      # This file
📚 Documentation
🚀 Getting Started
- What is QDrant Loader? - Project overview and use cases
- Installation Guide - Complete installation instructions
- Quick Start - 5-minute getting started guide
- Core Concepts - Vector databases and embeddings explained
👥 For Users
- User Documentation - Comprehensive user guides
- Data Sources - Git, Confluence, JIRA, and more
- File Conversion - PDF, Office docs, images processing
- MCP Server - AI development integration
- Configuration - Complete configuration reference
🛠️ For Developers
- Developer Documentation - Architecture and contribution guides
- Architecture - System design and components
- Testing - Testing guide and best practices
- Deployment - Deployment guide and configurations
- Extending - Custom data sources and processors
📦 Package Documentation
- QDrant Loader Package - Core loader documentation
- MCP Server Package - MCP server documentation
- Website Generator - Documentation website
🤝 Contributing
We welcome contributions! Please see our Contributing Guide for details on:
- Setting up the development environment
- Code style and standards
- Pull request process
- Issue reporting guidelines
Quick Development Setup
# Clone the repository
git clone https://github.com/martin-papy/qdrant-loader.git
cd qdrant-loader
# Create virtual environment
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
# Install in development mode
pip install -e packages/qdrant-loader[dev]
pip install -e packages/qdrant-loader-mcp-server[dev]
# Run tests
pytest
🆘 Support
- Issues - Bug reports and feature requests
- Discussions - Community discussions and Q&A
- Documentation - Comprehensive guides and references
📄 License
This project is licensed under the GNU GPLv3 - see the LICENSE file for details.
Ready to supercharge your AI development workflow? Start with our Quick Start Guide or explore the complete documentation.