Turbular is an open-source Model Context Protocol (MCP) server that enables seamless database connectivity for large language models (LLMs). It provides a unified API to interact with various database types, making it well suited for AI applications that need to work with multiple data sources.
- 🔌 Multi-Database Support: Connect to various database types through a single API
- 🔄 Schema Normalization: Automatically normalize database schemas to consistent naming conventions for LLM compatibility
- 🔒 Secure Connections: Support for SSL and various authentication methods
- 🚀 High Performance: Optimizes your LLM-generated queries before execution
- 📝 Query Transformation: Let the LLM write queries against the normalized schema and have them transformed back into their original, unnormalized form
- 🐳 Docker Support: Easy deployment with Docker and Docker Compose
- 🔧 Easy to Extend: Add new database providers by extending the BaseDBConnector interface (see the sketch below)
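For illustration only, a new provider would subclass BaseDBConnector roughly as in the following hypothetical sketch; the import path and method names here are assumptions, not the actual interface, so check the repository source for the real signatures:

```python
# Hypothetical sketch: the import path and method names are illustrative
# assumptions, not the actual BaseDBConnector interface in this repository.
from app.db_connectors.base import BaseDBConnector  # assumed location


class MyDBConnector(BaseDBConnector):
    """Illustrative new provider plugged in by subclassing BaseDBConnector."""

    def connect(self, connection_info: dict) -> None:
        # Open a connection using the provider's native driver.
        raise NotImplementedError

    def get_schema(self, normalized: bool = False) -> dict:
        # Return the schema, optionally normalized for LLM consumption.
        raise NotImplementedError

    def execute_query(self, query: str, max_rows: int = 100) -> list:
        # Execute a query and return at most max_rows rows.
        raise NotImplementedError
```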
| Database Type | Status |
|---------------|--------|
| PostgreSQL    | ✅     |
| MySQL         | ✅     |
| SQLite        | ✅     |
| BigQuery      | ✅     |
| Oracle        | ✅     |
| MS SQL        | ✅     |
| Redshift      | ✅     |
- Clone the repository:

  ```bash
  git clone https://github.com/raeudigerRaeffi/turbular.git
  cd turbular
  ```

- Start the development environment:

  ```bash
  docker-compose -f docker-compose.dev.yml up --build
  ```

- Test the connection:

  ```bash
  ./scripts/test_connection.py
  ```
- Install Python 3.11 or higher
- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Run the server:

  ```bash
  uvicorn app.main:app --reload
  ```
`POST /get_schema`

Retrieve the schema of a connected database for your LLM agent.

Parameters:

- `db_info`: Database connection arguments
- `return_normalize_schema` (optional): Return the schema in an LLM-friendly format
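A request from Python might look like the sketch below, assuming the server runs on localhost:8000 as in the quick start and accepts these parameters as a single JSON body; the exact request shape is an assumption, so treat the Swagger UI at /docs as authoritative:

```python
import requests

# Connection arguments, as documented in the connection examples below.
db_info = {
    "database_type": "PostgreSQL",
    "username": "user",
    "password": "password",
    "host": "localhost",
    "port": 5432,
    "database_name": "mydb",
    "ssl": False,
}

# Assumption: db_info and return_normalize_schema are sent as one JSON body.
resp = requests.post(
    "http://localhost:8000/get_schema",
    json={"db_info": db_info, "return_normalize_schema": True},
)
resp.raise_for_status()
print(resp.json())
```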
`POST /execute_query`

Optimize and then execute a SQL query on the connected database.

Parameters:

- `db_info`: Database connection arguments
- `query`: SQL query string
- `normalized_query`: Boolean indicating whether the query is written against the normalized schema
- `max_rows`: Maximum number of rows to return
- `autocommit`: Boolean for autocommit mode
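A sketch of a call, again assuming a JSON body and the localhost:8000 address from the quick start (the table in the query is purely hypothetical):

```python
import requests

payload = {
    # Same connection dict as in the /get_schema example above.
    "db_info": {
        "database_type": "PostgreSQL",
        "username": "user",
        "password": "password",
        "host": "localhost",
        "port": 5432,
        "database_name": "mydb",
        "ssl": False,
    },
    "query": "SELECT id, name FROM users LIMIT 10;",  # hypothetical table
    "normalized_query": True,
    "max_rows": 100,
    "autocommit": False,
}

resp = requests.post("http://localhost:8000/execute_query", json=payload)
resp.raise_for_status()
print(resp.json())
```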
`POST /upload-bigquery-key`

Upload a BigQuery service account key file.

Parameters:

- `project_id`: BigQuery project ID
- `key_file`: JSON key file
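For example, a sketch assuming `project_id` is sent as a form field and the key as a multipart file upload (verify the exact shape in /docs):

```python
import requests

# Assumption: multipart upload with project_id as a plain form field.
with open("/path/to/service_account.json", "rb") as key_file:
    resp = requests.post(
        "http://localhost:8000/upload-bigquery-key",
        data={"project_id": "my-project"},
        files={"key_file": ("service_account.json", key_file, "application/json")},
    )
resp.raise_for_status()
print(resp.json())
```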
`POST /upload-sqlite-file`

Upload a SQLite database file.

Parameters:

- `database_name`: Name to identify the database
- `db_file`: SQLite database file (`.db` or `.sqlite`)
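Analogously to the BigQuery upload, a sketch assuming a multipart form:

```python
import requests

# Assumption: multipart upload with database_name as a plain form field.
with open("my_database.db", "rb") as db_file:
    resp = requests.post(
        "http://localhost:8000/upload-sqlite-file",
        data={"database_name": "my_database"},
        files={"db_file": ("my_database.db", db_file, "application/octet-stream")},
    )
resp.raise_for_status()
print(resp.json())
```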
`GET /health`

Verify that the API is running.

`GET /supported-databases`

Get a list of all supported database types.
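Both are plain GET requests, for example (assuming the default localhost:8000 address from the quick start):

```python
import requests

# Liveness check and discovery of available database backends.
print(requests.get("http://localhost:8000/health").json())
print(requests.get("http://localhost:8000/supported-databases").json())
```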
- Fork and clone the repository
- Create a development environment:

  ```bash
  docker-compose -f docker-compose.dev.yml up --build
  ```

- The development server includes:
  - FastAPI server with hot reload
  - PostgreSQL test database
  - Pre-configured test data
- Access the API documentation:
  - Swagger UI: http://localhost:8000/docs
  - ReDoc: http://localhost:8000/redoc
We welcome contributions! Here's how you can help:
- Check out our contribution guidelines
- Look for open issues
- Submit pull requests with improvements
- Help with documentation
- Share your feedback
- Follow PEP 8 style guide
- Write tests for new features
- Update documentation as needed
- Use meaningful commit messages
- Add more testing, formatting, and commit hooks
- Add SSH support for database connections
- Add APIs as data sources using Steampipe
- Enable local schema saving for databases to which the server has already connected
- Add more data sources (Snowflake, MongoDB, Excel, etc.)
- Add authentication protection to routes
Run the test suite:

```bash
pytest
```

For development tests against the included PostgreSQL database:

```bash
./scripts/test_connection.py
```
PostgreSQL:

```python
connection_info = {
    "database_type": "PostgreSQL",
    "username": "user",
    "password": "password",
    "host": "localhost",
    "port": 5432,
    "database_name": "mydb",
    "ssl": False
}
```
BigQuery:

```python
connection_info = {
    "database_type": "BigQuery",
    "path_cred": "/path/to/credentials.json",
    "project_id": "my-project",
    "dataset_id": "my_dataset"
}
```
SQLite:

```python
connection_info = {
    "database_type": "SQLite",
    "database_name": "my_database"
}
```
- FastAPI for the amazing framework
- SQLAlchemy for database support
- Henry Albert Jupiter Hommel (@henryclickclack) as co-developer ❤️
- All our contributors and users
- Create an issue
- Email: raffael@turbular.com