A Model Context Protocol (MCP) server for Factifai integration with any MCP-compatible AI tool. The server is tool-agnostic: it works with any client that supports MCP, and it currently exposes tools to create tests asynchronously and retrieve their results.
Requirements:
- Node.js >= 16.0.0 (you can verify your installed version as shown below)
- Hai Build, Cursor, Windsurf, Claude Desktop or any MCP Client
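A quick way to confirm the Node.js requirement:
# Verify the installed Node.js version
node --version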
# Latest version
npx --yes @presidio-dev/factifai-mcp-server@latest
# Specific version
npx --yes @presidio-dev/factifai-mcp-server@1.2.3
We recommend npx to install the server, but you can use any Node package manager of your preference, such as yarn, pnpm, bun, etc.
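For reference, the equivalent one-off run with a few other package managers looks like this; a quick sketch, assuming reasonably recent versions of pnpm, Bun, and Yarn (v2+) with their dlx-style runners:
# pnpm
pnpm dlx @presidio-dev/factifai-mcp-server@latest
# bun
bunx @presidio-dev/factifai-mcp-server@latest
# yarn (v2+)
yarn dlx @presidio-dev/factifai-mcp-server@latest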
The installation includes:
- Downloading browser binaries (Chromium, Firefox, WebKit)
- Installing browser dependencies
- Setting up the necessary environment
This happens only once, and subsequent runs will be much faster as the browsers are already installed.
To avoid timeout issues, we strongly recommend pre-installing Playwright browsers manually:
# Step 1: Install Playwright browsers manually before installing the MCP server
npx playwright install --with-deps
# Step 2: Then install the MCP server (will be much faster and avoid timeouts)
npx --yes @presidio-dev/factifai-mcp-server@latest
This pre-installation step:
- Ensures browsers are downloaded without MCP client timeout constraints
- Significantly speeds up the MCP server's first-time installation
- Prevents installation failures due to timeout issues in your IDE or MCP client
With npx (latest version):
{
"factifai": {
"command": "npx",
"args": ["--yes", "@presidio-dev/factifai-mcp-server@latest"],
"env": {
"MODEL_PROVIDER": "bedrock|openai",
"OPENAI_API_KEY": "<your-openai-api-key>",
"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
"AWS_DEFAULT_REGION": "<your-aws-region>"
},
"disabled": false,
"autoApprove": []
}
}
With npx (specific version):
{
"factifai": {
"command": "npx",
"args": ["--yes", "@presidio-dev/factifai-mcp-server@1.2.3"],
"env": {
"MODEL_PROVIDER": "bedrock|openai",
"OPENAI_API_KEY": "<your-openai-api-key>",
"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
"AWS_DEFAULT_REGION": "<your-aws-region>"
},
"disabled": false,
"autoApprove": []
}
}
| Variable Name | Description |
|---|---|
| `MODEL_PROVIDER` | The model provider to use (`bedrock` or `openai`) |
| `OPENAI_API_KEY` | The API key for the OpenAI model provider |
| `AWS_ACCESS_KEY_ID` | The AWS access key ID for the Bedrock model provider |
| `AWS_SECRET_ACCESS_KEY` | The AWS secret access key for the Bedrock model provider |
| `AWS_DEFAULT_REGION` | The AWS default region for the Bedrock model provider |
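These variables are normally supplied through the env block of your MCP client configuration, as in the examples that follow. If you want to launch the server manually from a terminal (for debugging), a minimal sketch with the OpenAI provider looks like this; the server speaks MCP over stdio, so it will simply wait for a client to connect:
# Hypothetical manual launch for debugging; normally your MCP client starts the server
export MODEL_PROVIDER=openai
export OPENAI_API_KEY=<your-openai-api-key>
npx --yes @presidio-dev/factifai-mcp-server@latest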
Example configuration for the Bedrock provider:
{
"factifai": {
"command": "npx",
"args": ["--yes", "@presidio-dev/factifai-mcp-server@latest"],
"env": {
"MODEL_PROVIDER": "bedrock",
"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
"AWS_DEFAULT_REGION": "<your-aws-region>"
},
"disabled": false,
"autoApprove": []
}
}
Example configuration for the OpenAI provider:
{
"factifai": {
"command": "npx",
"args": ["--yes", "@presidio-dev/factifai-mcp-server@latest"],
"env": {
"MODEL_PROVIDER": "openai",
"OPENAI_API_KEY": "<your-openai-api-key>"
},
"disabled": false,
"autoApprove": []
}
}
See the setup instructions for each MCP client below.
Install in Hai Build
Add the following to your hai_mcp_settings.json
file. To open this file from Hai Build, click the "MCP Servers" icon, select the "Installed" tab, and then click "Configure MCP Servers".
See the Hai Build MCP documentation for more info.
{
"mcpServers": {
"factifai": {
"command": "npx",
"args": ["-y", "@presidio-dev/factifai-mcp-server@latest"],
"env": {
"MODEL_PROVIDER": "bedrock|openai",
"OPENAI_API_KEY": "<your-openai-api-key>",
"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
"AWS_DEFAULT_REGION": "<your-aws-region>"
}
}
}
}
Install in Amazon Q Developer
Add the following to your Amazon Q Developer configuration file. See MCP configuration for Q Developer in the IDE for more details.
The configuration file can be stored globally at ~/.aws/amazonq/mcp.json to be available across all your projects, or locally within your project at .amazonq/mcp.json.
{
"mcpServers": {
"factifai": {
"command": "npx",
"args": ["-y", "@presidio-dev/factifai-mcp-server@latest"],
"env": {
"MODEL_PROVIDER": "bedrock|openai",
"OPENAI_API_KEY": "<your-openai-api-key>",
"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
"AWS_DEFAULT_REGION": "<your-aws-region>"
}
}
}
}
Install in VS Code (Copilot)
First, enable MCP support in VS Code by opening Settings (Ctrl+,), searching for mcp.enabled, and checking the box.
Then, add the following configuration to your user or workspace settings.json file. See the VS Code MCP documentation for more info.
"mcp": {
"servers": {
"factifai": {
"type": "stdio",
"command": "npx",
"args": ["-y", "@presidio-dev/factifai-mcp-server@latest"],
"env": {
"MODEL_PROVIDER": "bedrock|openai",
"OPENAI_API_KEY": "<your-openai-api-key>",
"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
"AWS_DEFAULT_REGION": "<your-aws-region>"
}
}
}
}
Install in Cursor
The easiest way to install is with the one-click installation button below.
Alternatively, you can manually configure the server by adding the following to your mcp.json file. This file can be located globally at ~/.cursor/mcp.json or within a specific project at .cursor/mcp.json. See the Cursor MCP documentation for more information.
{
"mcpServers": {
"factifai": {
"command": "npx",
"args": ["--yes", "@presidio-dev/factifai-mcp-server@latest"],
"env": {
"MODEL_PROVIDER": "bedrock|openai",
"OPENAI_API_KEY": "<your-openai-api-key>",
"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
"AWS_DEFAULT_REGION": "<your-aws-region>"
}
}
}
}
Install in Windsurf
Add the following to your ~/.codeium/windsurf/mcp_config.json
file. See the Windsurf MCP documentation for more information.
{
"mcpServers": {
"factifai": {
"command": "npx",
"args": ["-y", "@presidio-dev/factifai-mcp-server@latest"],
"env": {
"MODEL_PROVIDER": "bedrock|openai",
"OPENAI_API_KEY": "<your-openai-api-key>",
"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
"AWS_DEFAULT_REGION": "<your-aws-region>"
}
}
}
}
Install in Zed
You can add the Factifai MCP server in Zed by editing your settings.json file (accessible via the zed: settings action) or by using the Agent Panel's configuration UI (agent: open configuration). See the Zed MCP documentation for more information.
Add the following to your settings.json:
{
"context_servers": {
"factifai": {
"command": {
"path": "npx",
"args": ["-y", "@presidio-dev/factifai-mcp-server@latest"],
"env": {
"MODEL_PROVIDER": "bedrock|openai",
"OPENAI_API_KEY": "<your-openai-api-key>",
"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
"AWS_DEFAULT_REGION": "<your-aws-region>"
}
}
}
}
}
| Tool Name | Description |
|---|---|
| `testWithFactifai` | Start a test with Factifai |
| `getFactifaiSessionResult` | Get test result |
| `listFactifaiSessions` | List tests |
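For context, MCP clients invoke these tools with standard JSON-RPC tools/call requests over stdio. The sketch below shows the general shape of such a request for testWithFactifai; the argument names (testCase, url) are illustrative assumptions rather than the server's documented schema, which your client discovers via tools/list:
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "testWithFactifai",
    "arguments": {
      "testCase": "Log in with valid credentials and verify the dashboard loads",
      "url": "https://example.com"
    }
  }
}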
We welcome contributions to the Factifai MCP Server! Please see our Contributing Guide for more information on how to get started.
For information about our security policy and how to report security vulnerabilities, please see our Security Policy.
This project is licensed under the MIT License - see the LICENSE file for details.