A comprehensive toolkit for loading data into the Qdrant vector database, with advanced MCP server support for AI-powered development workflows.
QDrant Loader is a powerful data ingestion and retrieval system that bridges the gap between your technical content and AI development tools. It collects, processes, and vectorizes content from multiple sources, then provides intelligent search capabilities through a Model Context Protocol (MCP) server.
Perfect for:
- AI-powered development with Cursor, Windsurf, and GitHub Copilot
- Knowledge base creation from scattered documentation
- Intelligent code assistance with contextual documentation
- Enterprise content integration from Confluence, JIRA, and Git repositories
This monorepo contains two complementary packages:
QDrant Loader
Data ingestion and processing engine
Collects and vectorizes content from multiple sources into the Qdrant vector database.
Key Features:
- Multi-source connectors: Git, Confluence (Cloud & Data Center), JIRA (Cloud & Data Center), Public Docs, Local Files (see the config sketch after this list)
- Advanced file conversion: 20+ file types, including PDF, Office documents, and images, with AI-powered processing
- Intelligent chunking: Smart document processing with metadata extraction
- Incremental updates: Change detection and efficient synchronization
- Flexible embeddings: OpenAI, local models, and custom endpoints
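To make the multi-source idea concrete, here is a hedged sketch of a `config.yaml` that combines the Git entry from the quick start below with a Confluence entry. Only the git block follows the project's template; the confluence keys (`base_url`, `space_key`, `token`) are hypothetical illustrations, so check the bundled `config.template.yaml` for the actual schema.

```yaml
# Sketch only: the git block mirrors the quick-start template below;
# the confluence keys are assumed for illustration and may not match
# the real schema in config.template.yaml.
sources:
  git:
    - url: "https://github.com/your-org/your-repo.git"
      branch: "main"
  confluence:
    - base_url: "https://your-org.atlassian.net/wiki"  # hypothetical key
      space_key: "DOCS"                                # hypothetical key
      token: "${CONFLUENCE_API_TOKEN}"                 # hypothetical key
```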
QDrant Loader MCP Server
AI development integration layer
Model Context Protocol server providing RAG capabilities to AI development tools.
Key Features:
- MCP protocol compliance: Full integration with Cursor, Windsurf, and Claude Desktop
- Advanced search tools: Semantic, hierarchy-aware, and attachment-focused search
- Confluence intelligence: Deep understanding of page hierarchies and relationships
- File attachment support: Comprehensive attachment discovery with parent document context
- Real-time processing: Streaming responses for large result sets
# Install both packages
pip install qdrant-loader qdrant-loader-mcp-server
# Or install individually
pip install qdrant-loader # Data ingestion only
pip install qdrant-loader-mcp-server # MCP server only
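A quick way to confirm both console scripts landed on your PATH before continuing (the `--help` flag is assumed to exist on both CLIs):

```bash
# Sanity check: both entry points should resolve after installation.
# --help is assumed here; any invocation that prints usage works.
qdrant-loader --help
mcp-qdrant-loader --help
```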
- Create a workspace

  mkdir my-qdrant-workspace && cd my-qdrant-workspace

- Download configuration templates

  curl -o config.yaml https://raw.githubusercontent.com/martin-papy/qdrant-loader/main/packages/qdrant-loader/conf/config.template.yaml
  curl -o .env https://raw.githubusercontent.com/martin-papy/qdrant-loader/main/packages/qdrant-loader/conf/.env.template

- Configure your environment (edit .env)

  QDRANT_URL=http://localhost:6333
  QDRANT_COLLECTION_NAME=my_docs
  OPENAI_API_KEY=your_openai_key

- Configure data sources (edit config.yaml)

  sources:
    git:
      - url: "https://github.com/your-org/your-repo.git"
        branch: "main"

- Load your data

  qdrant-loader --workspace . init
  qdrant-loader --workspace . ingest

- Start the MCP server

  mcp-qdrant-loader
You're ready! Your content is now searchable through AI development tools.
Add to .cursor/mcp.json:
{
"mcpServers": {
"qdrant-loader": {
"command": "/path/to/venv/bin/mcp-qdrant-loader",
"env": {
"QDRANT_URL": "http://localhost:6333",
"QDRANT_COLLECTION_NAME": "my_docs",
"OPENAI_API_KEY": "your_key",
"MCP_DISABLE_CONSOLE_LOGGING": "true"
}
}
}
}
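Windsurf and Claude Desktop can point at the same server. As a hedged sketch (assuming Claude Desktop's standard claude_desktop_config.json layout; the command path and key values below are placeholders), the entry mirrors the Cursor block above:

```json
{
  "mcpServers": {
    "qdrant-loader": {
      "command": "/path/to/venv/bin/mcp-qdrant-loader",
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "QDRANT_COLLECTION_NAME": "my_docs",
        "OPENAI_API_KEY": "your_key"
      }
    }
  }
}
```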
- "Find documentation about authentication in our API"
- "Show me examples of error handling patterns"
- "What are the deployment requirements for this service?"
- "Find all attachments related to database schema"
qdrant-loader/
├── packages/
│   ├── qdrant-loader/             # Core data ingestion package
│   └── qdrant-loader-mcp-server/  # MCP server for AI integration
├── docs/                          # Comprehensive documentation
├── website/                       # Documentation website generator
└── README.md                      # This file
- What is QDrant Loader? - Project overview and use cases
- Installation Guide - Complete installation instructions
- Quick Start - 5-minute getting started guide
- Core Concepts - Vector databases and embeddings explained
- User Documentation - Comprehensive user guides
  - Data Sources - Git, Confluence, JIRA, and more
  - File Conversion - PDF, Office docs, and image processing
  - MCP Server - AI development integration
  - Configuration - Complete configuration reference
- Developer Documentation - Architecture and contribution guides
  - Architecture - System design and components
  - Testing - Testing guide and best practices
  - Deployment - Deployment guide and configurations
  - Extending - Custom data sources and processors
- QDrant Loader Package - Core loader documentation
- MCP Server Package - MCP server documentation
- Website Generator - Documentation website
We welcome contributions! Please see our Contributing Guide for details on:
- Setting up the development environment
- Code style and standards
- Pull request process
- Issue reporting guidelines
# Clone the repository
git clone https://github.com/martin-papy/qdrant-loader.git
cd qdrant-loader
# Create virtual environment
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
# Install in development mode
pip install -e packages/qdrant-loader[dev]
pip install -e packages/qdrant-loader-mcp-server[dev]
# Run tests
pytest
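If you are iterating on a single package, you can point pytest at just that package's directory (assuming, as the layout above suggests, that each package carries its own tests):

```bash
# Run one package's test suite at a time (directory-scoped collection).
pytest packages/qdrant-loader
pytest packages/qdrant-loader-mcp-server
```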
- Issues - Bug reports and feature requests
- Discussions - Community discussions and Q&A
- Documentation - Comprehensive guides and references
This project is licensed under the GNU GPLv3 - see the LICENSE file for details.
Ready to supercharge your AI development workflow? Start with our Quick Start Guide or explore the complete documentation.