A KDB.AI MCP (Model Context Protocol) Server that enables interaction with KDB.AI through natural language for seamless vector database operations, vector similarity searches, hybrid search operations, and advanced data analysis.
Built on an extensible framework with configurable templates, it allows for intuitive extension with custom integrations tailored to your specific vector search and AI-powered data needs. The server leverages a combination of curated resources, intelligent prompts, and robust tools to provide appropriate guidance for AI models interacting with KDB.AI.
- Supported Environments
- Prerequisites
- Quickstart
- Features
- MCP Server Installation
- Security Considerations
- Transport Options
- Command Line Tool
- Configure Embeddings
- Usage with Claude Desktop
- Prompts/Resources/Tools
- Development
- Testing
- Troubleshooting
- Useful Resources
The following table shows the install options for supported Operating Systems:
Primary OS | KDB.AI | MCP Server | UV/NPX | Claude Desktop | Alternative MCP Client |
---|---|---|---|---|---|
Mac | ✅ Docker | ✅ Local | ✅ Local | ✅ Local (streamable-http/stdio) | ✅ Other clients |
Linux | ✅ Docker | ✅ Local | ✅ Local | ❌ Not supported | ✅ Other clients |
WSL | ✅ Docker | ✅ Local | ✅ Local | ❌ Not supported | ✅ Other clients |
Windows | ✅ Docker | ✅ Local | ✅ Local | ✅ Local (streamable-http only) | ✅ Other clients |
Before installing and running the KDB.AI MCP Server, ensure you have:
- Cloned this repo
- A suitable container runtime installed - we recommend Docker Desktop or Rancher-Desktop
- For Windows users, we recommend installing Docker Desktop using the WSL2 integration as outlined here
- Signed up for KDB.AI
- Follow the Server Setup Guide to get up and running quickly.
- See supporting KDB.AI documentation for additional information
- UV Installed for running the KDB.AI MCP Server - available on Windows/Mac/Linux/WSL
- Claude Desktop installed, or another MCP-compatible client that will connect to the KDB.AI MCP Server - available on Windows/Mac
- NPX installed - required to use streamable-http transport with Claude Desktop
- npx may not be required if you are using a different MCP Client - consult the documentation of your chosen MCP Client
- npx comes bundled with the nodejs installer - available on Windows/Mac/Linux/WSL
- See example configuration with streamable-http
To simplify getting started, we recommend running your MCP Client, KDB.AI MCP server, and your KDB.AI database on the same internal network. See Security Considerations for more information.
To demonstrate basic usage of the KDB.AI MCP Server, using an empty KDB.AI database, follow the quickstart steps below.
Note: Ensure you have followed the necessary prerequisite steps
- Start your KDB.AI Server - follow the getting started steps from the KDB.AI Server setup guide
- Test connectivity to the KDB.AI Server. If you have set up the KDB.AI Server following step 1, your endpoint will be http://localhost:8082; otherwise update to your configured endpoint.
  uv run --with kdbai-client --python=3.12 python -c "import kdbai_client as kx; session = kx.Session(endpoint='http://localhost:8082'); print(session.version())"
If you see a response like the one below, your KDB.AI Server is configured correctly - please proceed to the next step. If you see different version numbers, that's ok.
{'serverVersion': 'latest', 'clientMinVersion': '1.7.0', 'clientMaxVersion': 'latest'}
If you get an error message like Error during creating connection..., this usually indicates that either:
- KDB.AI Server is not running
- KDB.AI Server is not accepting connections
See the troubleshooting section for more details.
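Before running the kdbai-client one-liner, it can help to confirm the port is reachable at all. The sketch below is a generic TCP check assuming only the host and port from step 1; it is not part of the KDB.AI client API.

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 10.0) -> bool:
    """Poll a TCP port until it accepts connections or the timeout elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # A successful connect means something is listening on the port.
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.5)
    return False

if __name__ == "__main__":
    print(wait_for_port("localhost", 8082))
```

If this returns False, check that the KDB.AI container is running before debugging the client itself.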
- Configure Claude Desktop with your chosen transport.
- Configure Embeddings with your chosen embeddings provider and model.
- If you have configured Claude Desktop with stdio transport, this step is not required - Claude Desktop will manage starting the MCP Server for you; move to the next step. Otherwise, start the MCP Server:
  uv run mcp-server
- Start Claude Desktop and verify that the tools and prompts outlined in the Validate Claude Desktop Config section are visible.
- Create some tables and add some data following the KDB.AI Quickstart Guide
- Load the kdbai_operations_guidance resource. This will give your MCP client guidance on how to interact with your KDB.AI database.
- Try the kdbai_table_analysis prompt and generate an analysis prompt for one of your tables.
- Ask questions in natural language: interact with your KDB.AI database using plain English. Your MCP client will use one or more of the available tools to answer your questions.
- Similarity Search: Similarity search on embedded text within the vector database, based on index(es) built at the KDB.AI server
- Hybrid Search: Hybrid search on sparse and dense indices built at the KDB.AI server
- Customizable Query and Search Result Optimization: Customizable query and search, including result truncation (query only), filtering, grouping, aggregation, and sorting
- Query Guidance for LLMs: Comprehensive, LLM-ready MCP resource (file://kdbai_operations_guidance) with syntax examples and best practices
- Database Schema Discovery: Explore and understand your database tables and structure using the included MCP resource for quick, intelligent insights.
- Auto-Discovery System: Automatic discovery and registration of tools, resources, and prompts from their respective directories
- Ready-Made Extension Template: Ready-to-use templates for tools, resources, and prompts with best practices and documentation for extending functionality
- Unified Intelligence: A powerful combination of intelligent prompts, purpose-built tools, and curated MCP resources that together deliver fast, optimized, context-aware results.
- HTTP Streamable Protocol Support: Supports the latest MCP streamable HTTP protocol for efficient data flow, while automatically blocking the deprecated SSE protocol.
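To make the hybrid search weights above concrete: the KDB.AI server performs the actual score fusion, but the idea of the --db.vector-weight and --db.sparse-weight options (defaults 0.7 and 0.3) can be sketched as a simple weighted combination.

```python
def hybrid_score(vector_sim: float, sparse_sim: float,
                 vector_weight: float = 0.7, sparse_weight: float = 0.3) -> float:
    """Weighted fusion of dense (vector) and sparse (text) similarity scores.

    Illustrative only: the real fusion happens inside the KDB.AI server; the
    defaults mirror the server's --db.vector-weight / --db.sparse-weight flags.
    """
    return vector_weight * vector_sim + sparse_weight * sparse_sim

# A document that matches strongly on vectors but weakly on text still
# scores well with the default weights:
print(hybrid_score(0.9, 0.1))
```

Raising sparse_weight shifts the ranking toward exact keyword matches; raising vector_weight favors semantic similarity.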
git clone https://github.com/KxSystems/kdbai-mcp-server.git
cd kdbai-mcp-server
uv sync
This step is optional, but can be useful when starting the MCP server for the first time or after adding new dependencies. If you do not run uv sync first, the MCP client can time out waiting for the dependencies to be installed. This can be caused by packages like sentence-transformers with large dependencies.
uv run mcp-server
For more info on the supported transports see official documentation
Note: We don't support sse transport (server-sent events) as it has been deprecated since protocol version 2024-11-05.
To simplify getting started, we recommend running your MCP Client, KDB.AI MCP server, and your KDB.AI database on the same internal network.
If you require an encrypted connection between your KDB.AI MCP server and your KDB.AI database, you can enable the following options:
- QIPC with TLS: use flag --db.qipc-tls=true
- REST with HTTPS: use flag --db.rest-protocol=https
Both require setting up a TLS/HTTPS proxy (envoy, nginx) in front of KDB.AI as a prerequisite:
- Since the proxy will terminate TLS connections, we recommend that the proxy runs on the same host as your KDB.AI Server
- The proxy will need its own certificates - if you do not have your own certificates, you can create self-signed certificates for internal use. See an example of creating self-signed certificates that can be used with your proxy
- For QIPC connections using self-signed certificates:
- You will need to specify the location of your self-signed CA cert
- Set the KX_SSL_CA_CERT_FILE environment variable to point to the CA cert file that your proxy is using
- Alternatively, you can bypass certificate verification by setting KX_SSL_VERIFY_SERVER=NO for development and testing
- For Kubernetes: Consider using a service mesh like istio for simplified certificate management
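For reference, the QIPC TLS environment variables above can also be set from Python before the server (or any kdbai_client session) starts. The certificate path below is a placeholder - substitute the CA cert your proxy actually uses.

```python
import os

# Hypothetical path: point the SSL layer at the CA cert your TLS proxy uses.
os.environ["KX_SSL_CA_CERT_FILE"] = "/etc/ssl/certs/proxy-ca.pem"

# Development/testing only: bypass server certificate verification instead.
# os.environ["KX_SSL_VERIFY_SERVER"] = "NO"

print(os.environ["KX_SSL_CA_CERT_FILE"])
```

Setting these in the shell profile or a .env file is equivalent; just make sure they are visible to the process that opens the QIPC connection.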
If you require an encrypted connection between your MCP Client and your KDB.AI MCP server:
- The KDB.AI MCP server uses streamable-http transport by default and starts a localhost server at 127.0.0.1:7000. We do not recommend exposing this externally.
- You can optionally set up an HTTPS proxy, such as envoy or nginx, in front of your KDB.AI MCP server for HTTPS termination
- FastMCP v2 was evaluated for its authentication features, but we will remain temporarily on v1 to preserve broad model compatibility until clients/models catch up, at which point we'll transition.
- When using stdio transport, this is not required as communication is through standard input/output streams on the same host
The KDB.AI MCP Server provides detailed help text explaining all configuration options.
uv run mcp-server -h
usage: mcp-server [-h] [--mcp.server-name str] [--mcp.log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}]
[--mcp.transport {stdio,streamable-http}] [--mcp.port int] [--mcp.host str] [--db.host str]
[--db.port int] [--db.username str] [--db.password SecretStr] [--db.mode {rest,qipc}]
[--db.rest-protocol {http,https}] [--db.qipc-tls bool] [--db.database-name str] [--db.retry int]
[--db.k int] [--db.vector-weight float] [--db.sparse-weight float] [--db.embedding-csv-path str]
KDB.AI MCP Server that enables interaction with KDB.AI
options:
-h, --help show this help message and exit
mcp options:
MCP server configuration and transport settings
--mcp.server-name str
Name identifier for the MCP server instance [env: KDBAI_MCP_SERVER_NAME] (default:
KDBAI_MCP_Server)
--mcp.log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}
Logging verbosity level [env: KDBAI_MCP_LOG_LEVEL] (default: INFO)
--mcp.transport {stdio,streamable-http}
Communication protocol: 'stdio' (pipes) or 'streamable-http' (HTTP server) [env:
KDBAI_MCP_TRANSPORT] (default: streamable-http)
--mcp.port int HTTP server port - ignored when using stdio transport [env: KDBAI_MCP_PORT] (default: 7000)
--mcp.host str HTTP server bind address - ignored when using stdio transport [env: KDBAI_MCP_HOST] (default:
127.0.0.1)
db options:
KDB.AI database connection and search configuration
--db.host str KDB.AI server hostname or IP address [env: KDBAI_DB_HOST] (default: 127.0.0.1)
--db.port int KDB.AI server port number [env: KDBAI_DB_PORT] (default: 8082)
--db.username str Username for KDB.AI authentication [env: KDBAI_DB_USERNAME] (default: )
--db.password SecretStr
Password for KDB.AI authentication [env: KDBAI_DB_PASSWORD] (default: )
--db.mode {rest,qipc}
API mode: 'qipc' (fast binary protocol) or 'rest' (HTTP API) [env: KDBAI_DB_MODE] (default:
qipc)
--db.rest-protocol {http,https}
Select protocol for REST mode, not considered for QIPC mode [env: KDBAI_DB_REST_PROTOCOL]
(default: http)
--db.qipc-tls bool Enable TLS for QIPC mode, not considered for REST mode. When using TLS with QIPC you will need
to set the environment variable `KX_SSL_CA_CERT_FILE` that points to the certificate on your
local filesystem that your TLS proxy is using. For local development and testing you can set
`KX_SSL_VERIFY_SERVER=NO` to bypass this requirement [env: KDBAI_DB_QIPC_TLS] (default: False)
--db.database-name str
Default database name to use for operations [env: KDBAI_DB_DATABASE_NAME] (default: default)
--db.retry int Number of connection retry attempts on failure [env: KDBAI_DB_RETRY] (default: 2)
--db.k int Default number of results to return from vector searches [env: KDBAI_DB_K] (default: 5)
--db.vector-weight float
Weight for vector similarity in hybrid search (0.0-1.0) [env: KDBAI_DB_VECTOR_WEIGHT]
(default: 0.7)
--db.sparse-weight float
Weight for text similarity in hybrid search (0.0-1.0) [env: KDBAI_DB_SPARSE_WEIGHT] (default:
0.3)
--db.embedding-csv-path str
Path to embeddings csv [env: KDBAI_DB_EMBEDDING_CSV_PATH] (default:
src/mcp_server/utils/embeddings.csv)
The command line options are organized into two main categories:
- MCP Options - Controls the MCP server behavior and transport settings
- Database Options - Configures KDB.AI database connection and search behavior
For details on each option, refer to the help text
Configuration values are resolved in the following priority order:
- Command Line Arguments - Highest priority
- Environment Variables - Second priority
- .env File - Third priority
- Default Values - Lowest priority; defined in settings.py
Every command line option has a corresponding environment variable. For example:
- --mcp.port 8000 ↔ KDBAI_MCP_PORT=8000
- --db.host localhost ↔ KDBAI_DB_HOST=localhost
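The flag-to-variable naming convention, as inferred from the examples above and the help text, can be sketched as a small function (this is a description of the pattern, not code from the project):

```python
def flag_to_env(flag: str) -> str:
    """Map a CLI flag like '--mcp.port' to its environment variable name.

    Convention inferred from the documented examples: strip the leading
    dashes, prefix with 'KDBAI_', upper-case, and replace '.'/'-' with '_'.
    """
    name = flag.lstrip("-").replace(".", "_").replace("-", "_")
    return "KDBAI_" + name.upper()

print(flag_to_env("--db.rest-protocol"))  # KDBAI_DB_REST_PROTOCOL
```

Knowing the convention lets you derive the environment variable for any option in the help text without looking it up.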
# Using defaults
uv run mcp-server
# Using a .env file
echo "KDBAI_MCP_PORT=8080" >> .env
echo "KDBAI_DB_RETRY=4" >> .env
uv run mcp-server
# Using environment variables
export KDBAI_MCP_PORT=8080
export KDBAI_DB_RETRY=4
uv run mcp-server
# Using command line arguments
uv run mcp-server \
--mcp.port 8080 \
--db.retry 4
Before starting the KDB.AI MCP Server, you must configure embedding models for your tables if you wish to use similarity search. The repository includes two ready-to-use embedding providers: OpenAI and SentenceTransformers. You can customize these implementations as needed, or add your own provider by following the steps outlined below.
- Update Dependencies - add your required embedding providers to the pyproject.toml dependencies section.
- Set Environment Variables - configure required API keys for your chosen embedding providers if necessary (for example, set the environment variable OPENAI_API_KEY to use OpenAI's API)
- Add New Provider - the file src/mcp_server/utils/embeddings.py defines the base class EmbeddingProvider for all embedding providers. To add a new provider, create a class in the same file that extends this base class and implements all required abstract methods. You can use the existing OpenAI and SentenceTransformers implementations in the same file as templates - simply copy and modify them to suit your needs. To register your provider, use the @register_provider decorator above your class definition. The registered provider name does not have to match the provider's Python package name.
- Configure Table Embeddings - update the embeddings configuration file at src/mcp_server/utils/embeddings.csv with your actual database and table names, embedding providers, and models. The name you provide in embeddings.csv should match the registered provider name specified in embeddings.py.
Claude Desktop requires a claude_desktop_config.json file to be available. Add one of the example configurations below to the default configuration file location for your OS.
Platform | Default Configuration File Location |
---|---|
macOS | ~/Library/Application Support/Claude/claude_desktop_config.json |
Windows | %APPDATA%\Claude\claude_desktop_config.json |
To configure Claude Desktop with the KDB.AI MCP Server using streamable-http, copy the configuration below into an empty claude_desktop_config.json file. If you have pre-existing MCP servers, see the example config with multiple mcp-servers.
{
"mcpServers": {
"KDB.AI MCP streamable": {
"command": "npx",
"args": [
"mcp-remote",
"http://localhost:7000/mcp"
]
}
}
}
Note
- To use streamable-http with Claude Desktop you must have npx installed and available on your path - you can install it via nodejs.org
- You will need to start the MCP Server as a standalone python process. See section Run the server
- Ensure you have the correct endpoint - in this example our KDB.AI MCP server is running on port 7000.
- This means you will be responsible for starting and stopping the MCP Server; Claude Desktop will only access it via npx
- MCP logs will be visible from your terminal
To configure Claude Desktop with the KDB.AI MCP Server using stdio, copy the configuration below into an empty claude_desktop_config.json file. If you have pre-existing MCP servers, see the example config with multiple mcp-servers.
{
"mcpServers": {
"KDB.AI MCP stdio": {
"command": "/Users/<user>/.local/bin/uv",
"args": [
"--directory",
"/path/to/this/repo/",
"run",
"mcp-server",
"--mcp.transport",
"stdio"
]
}
}
}
Note
- Update <user> so the command points to the absolute path of the uv executable - only required if uv is not on your path
- Update the --directory path to the absolute path of this repo
- Claude Desktop is responsible for starting/stopping the MCP server when using stdio
- When using stdio, the MCP logs will be available at Claude Desktop's MCP Log Location
You can include multiple MCP servers like this:
{
"mcpServers": {
"KDB.AI MCP streamable": {
"command": "npx",
"args": [
"mcp-remote",
"http://localhost:7000/mcp"
]
},
"Another MCP Server": {...}
}
}
For detailed setup instructions, see the official Claude Desktop documentation.
- If you are using streamable-http you will need to start the MCP Server in a separate terminal window, and ensure it remains running. If you are using stdio, skip to step 2.
- Once the claude_desktop_config.json has been added with your chosen transport config, restart Claude Desktop. Then navigate to File > Settings > Developer. You should see that your KDB.AI MCP Server is running.
- Windows users: make sure to quit Claude Desktop via the system tray before restarting.
- From a chat window, click the search and tools icon just below the message box on the left. You'll see your MCP server listed as KDB.AI MCP streamable. Click it to access all tools.
- Click the '+' in the chat window, then select Add from KDB.AI MCP streamable to view the list of available prompts/resources.
Developer mode can be enabled to give quick access to:
- MCP Server Reloads - no need to quit Claude Desktop for every MCP Server restart
- MCP Configuration - shortcut to your claude_desktop_config.json
- MCP Logs - shortcut to Claude Desktop MCP logs. When using transport streamable-http, you will also need to review the MCP logs from your terminal
To enable Developer mode:
- Start Claude Desktop, click the menu in the upper-left corner > Help > Troubleshooting > Enable Developer Mode (confirm any popups)
- Restart Claude Desktop, click the menu in the upper-left corner > Developer - Developer settings should now be populated
Name | Purpose | Params | Return |
---|---|---|---|
kdbai_table_analysis | Generate a detailed analysis prompt for a specific table | table_name : Name of the KDB.AI table to analyzeanalysis_type : Type of analysis (overview, content, quality, search)sample_size : Number of records to examine |
The generated table analysis prompt |
Name | URI | Purpose | Params |
---|---|---|---|
kdbai_operations_guidance | file://kdbai_operations_guidance | Provides guidance when using KDB.AI operations like query, search, and hybrid search | None |
Name | Purpose | Params | Return |
---|---|---|---|
kdbai_query_data | Query data from a KDB.AI table with support for filtering, sorting, grouping, limit and aggregation. | table_name : Name of the table to querydatabase_name : Name of the database containing the table (optional)filters : List of filter conditions as q/kdb+ parse treesort_columns : List of column names to sort bygroup_by : List of column names to group byaggs : Dictionary of aggregation ruleslimit : Maximum number of rows to return |
Dictionary containing query results or error message |
kdbai_similarity_search | Perform vector similarity search on a KDB.AI table. | table_name : Name of the table to searchquery : Text query to convert to vector and searchvector_index_name : Name of the vector index to search againstdatabase_name : Name of the database (optional)n : Number of results to return (optional)filters : List of filter conditionssort_columns : List of column names to sort bygroup_by : List of column names to group byaggs : Dictionary of aggregation rules |
Dictionary containing search results |
kdbai_hybrid_search | Perform hybrid search combining vector and text (sparse) search on a KDB.AI table. | table_name : Name of the table to searchquery : Text query for both vector and text searchvector_index_name : Name of the vector indexsparse_index_name : Name of the sparse indexdatabase_name : Name of the database (optional)n : Number of results to return (optional)filters : List of filter conditionssort_columns : List of column names to sort bygroup_by : List of column names to group byaggs : Dictionary of aggregation rules |
Dictionary containing hybrid search results |
kdbai_list_databases | List all database names in the KDB.AI database. | None | Dictionary with status and list of database names |
kdbai_database_info | Get KDB.AI database information including tables information. | database : Name of the database (optional, defaults to 'default') |
Dictionary with status and database information |
kdbai_all_databases_info | Get information of all databases in KDB.AI including tables information for each database. | None | Dictionary with status and information of all databases |
kdbai_session_info | Get session information from KDB.AI. | None | String containing session information and metadata |
kdbai_system_info | Get system information from KDB.AI. | None | String containing system information and metadata |
kdbai_process_info | Get process information from KDB.AI. | None | String containing process information and metadata |
kdbai_list_tables | List all tables in the given database. | database_name : Name of the database (optional, defaults to configured database) |
Dictionary with database name and list of tables |
kdbai_table_info | Get comprehensive information about a table including schema and statistics. | table_name : Name of the tabledatabase_name : Name of the database (optional, defaults to configured database) |
Dictionary with table information including name, database, disk usage, row count, schema, and indexes |
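As an illustration of how arguments to the kdbai_query_data tool might be shaped when an MCP client calls it: the exact filter and aggregation syntax is defined by the KDB.AI documentation, so treat the table name, filter, and aggregation values below as hypothetical.

```python
# Hypothetical argument payload for kdbai_query_data; shapes follow the
# parameter descriptions in the table above, values are illustrative only.
query_args = {
    "table_name": "documents",                       # hypothetical table
    "database_name": "default",
    "filters": [["within", "year", [2020, 2024]]],   # q/kdb+ parse-tree style
    "sort_columns": ["year"],
    "group_by": ["category"],
    "aggs": {"n_docs": ["count", "id"]},             # alias -> [function, column]
    "limit": 100,
}
```

The search tools (kdbai_similarity_search, kdbai_hybrid_search) accept the same filter/sort/group/aggs shapes plus their query and index-name parameters.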
To add new tools:
- Create a new Python file in src/mcp_server/tools/.
- Implement your tool using the _template.py as a reference.
- The tool will be auto-discovered and registered when the server starts.
- Restart Claude Desktop to access your new tool.
To add new resources:
- Create a new Python file in src/mcp_server/resources/.
- Implement your resource using the _template.py as a reference.
- The resource will be auto-discovered and registered when the server starts.
- Restart Claude Desktop to access your new resource.
To add new prompts:
- Create a new Python file in src/mcp_server/prompts/.
- Implement your prompt using the _template.py as a reference.
- The prompt will be auto-discovered and registered when the server starts.
- Restart Claude Desktop to access your new prompt.
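The auto-discovery behaviour described above can be approximated with a small sketch: scan a package directory, skip template and private modules, and import the rest. This is illustrative of the pattern, not the project's actual loader.

```python
import importlib
import pkgutil

def discover_modules(package_name: str) -> list[str]:
    """Import every non-template module in a package.

    Mimics the server's auto-discovery of tools/resources/prompts:
    files starting with '_' (e.g. _template.py) are skipped.
    """
    package = importlib.import_module(package_name)
    found = []
    for info in pkgutil.iter_modules(package.__path__):
        if info.name.startswith("_"):
            continue
        importlib.import_module(f"{package_name}.{info.name}")
        found.append(info.name)
    return found
```

Dropping a new file into the directory is therefore enough for registration; no central list needs editing, which is why only a server restart is required.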
The tools below can aid in the development, testing, and debugging of new MCP tools, resources, and prompts.
- MCP Inspector is an interactive developer tool from Anthropic
- Postman to create MCP requests and store them in collections
This can happen when running the MCP Server for the first time with stdio transport. It's recommended to run uv sync as outlined in section Run the MCP Server.
If the MCP Server port is being used by another process you will need to specify a different port or stop the service that is using the port.
This means that your KDB.AI server is either not running or not accepting connections. Refer to the quickstart section above.
Valid transports are streamable-http and stdio.
Review the Server logs for registration errors. The logs include a registration summary for tools, resources and prompts. You can identify failed and skipped modules to help debug the issue.
This appears to be an issue with FastMCP. It references the sse transport mode specifically, but we observe the same behaviour with streamable-http. You will either need to close all open connections or kill the mcp process.
If you see that the MCP server is disabled after a query, restart/exit claude (as described above) and try again.
Platform | Default UV Path |
---|---|
macOS | ~/.local/bin/uv |
Linux | ~/.local/bin/uv |
Windows | %APPDATA%\Python\Scripts\uv.exe |
Platform | Path | Monitor Command |
---|---|---|
macOS | ~/Library/Logs/Claude/mcp*.log |
tail -f ~/Library/Logs/Claude/mcp*.log |
Windows | %APPDATA%\Claude\Logs\mcp*.log |
Get-Content -Path "$env:APPDATA\Claude\Logs\mcp*.log" -Wait |
For detailed troubleshooting, see official Claude MCP docs.
You may need to upgrade to a paid plan to avoid Claude usage errors like this:
Claude hit the maximum length for this conversation. Please start a new conversation to continue chatting with Claude.
- KDB.AI documentation for more information about KDB.AI
- KX Forum for community support
- KX Slack for support & feedback