A system for managing, tracking, and resolving problems with semantic similarity features, entity resolution, and graph relationships.
The system consists of:

### Backend Implementation

- Graph database (Neo4j) for problem storage
- Embedding-based similarity features
- Entity resolution system

### Chat Interface

- Conversational UI for interacting with the system
- OpenAI Assistant integration
- Function calling to manage problems
### Prerequisites

- Conda package manager
- Neo4j database (5.11+ recommended for vector search)
- OpenAI API key
- Node.js and npm (for the UI)
### Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/yourusername/matters.global.git
   cd matters.global
   ```

2. Create the conda environment from the `environment.yml` file:

   ```bash
   conda env create -f environment.yml
   ```

3. Activate the environment:

   ```bash
   conda activate mattersglobal
   ```

4. Set up environment variables:

   ```bash
   export OPENAI_API_KEY="your_openai_api_key"
   export NEO4J_URI="bolt://localhost:7687"
   export NEO4J_USERNAME="neo4j"
   export NEO4J_PASSWORD="password"
   ```

5. Quick start (automated setup and launch):

   ```bash
   ./start_new.sh
   ```

   This script will check dependencies, start the WebSocket server, and launch the UI.
### Neo4j Setup

1. Install and start Neo4j:

   - Download from the Neo4j website
   - Or use Docker:

     ```bash
     docker run -p 7474:7474 -p 7687:7687 neo4j:5.11
     ```

2. Create a new database and set a password.

3. Ensure the database is accessible at the URI specified in your environment variables.
### Running the Server

You have two server options:

**Option 1: WebSocket server**

```bash
python websocket_server.py
```

The server will start on port 8090 by default. You should see output confirming:

- Connection to Neo4j
- Creation or retrieval of the OpenAI Assistant
- WebSocket server running and ready for connections

**Option 2: Flask server**

```bash
python server.py
```

The server will start on port 5000 by default. You should see output confirming:

- Connection to Neo4j
- Creation or retrieval of the OpenAI Assistant
- REST API server running and ready for connections
### Chatbot UI

1. The chatbot UI is included in the `ui` directory:

   ```bash
   cd ui
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. The UI is already configured to connect to the WebSocket server at `ws://localhost:8091`. If you need to modify this, edit `src/config.ts` to change the `WEBSOCKET_ENDPOINT` value.

4. Start the UI development server:

   ```bash
   npm run dev
   ```

5. Access the chat interface at `http://localhost:5173` or `http://localhost:3000` (depending on Vite configuration).
### API Endpoints

The server exposes these main endpoints:

- `POST /api/chat/message` - Send a message to the assistant
- `GET /api/chat/history` - Get message history for a session
- `POST /api/chat/reset` - Reset or create a new session
- `GET /api/health` - Health check endpoint
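As a quick sanity check against these endpoints, a small Python helper can assemble the request. Note that the JSON field names (`session_id`, `message`) are assumptions for illustration — check `server.py` for the actual schema.

```python
import json

BASE_URL = "http://localhost:5000"  # Flask server default port

def build_chat_request(session_id: str, message: str) -> dict:
    """Assemble the JSON body for POST /api/chat/message.

    The field names here are hypothetical -- verify against server.py.
    """
    return {"session_id": session_id, "message": message}

# With the Flask server running, a call might look like (requires `requests`):
#   import requests
#   resp = requests.post(f"{BASE_URL}/api/chat/message",
#                        json=build_chat_request("demo", "What blocks launch?"))
#   print(resp.json())

if __name__ == "__main__":
    print(json.dumps(build_chat_request("demo", "hello"), indent=2))
```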
### Environment Variables

| Variable | Description | Default |
|---|---|---|
| `OPENAI_API_KEY` | Your OpenAI API key | None (Required) |
| `OPENAI_ASSISTANT_ID` | Optional existing assistant ID | None (Creates new) |
| `NEO4J_URI` | Neo4j connection URI | `bolt://localhost:7687` |
| `NEO4J_USERNAME` | Neo4j username | `neo4j` |
| `NEO4J_PASSWORD` | Neo4j password | `password` |
| `PORT` | Server port | `5000` |
| `FLASK_SECRET_KEY` | Flask session secret | Random UUID |
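A minimal sketch of how a server might read these variables with the defaults above — this is illustrative only; the actual loading code in `server.py` may differ.

```python
import os

def load_config() -> dict:
    """Read configuration from the environment, mirroring the defaults table.

    This sketch is not the project's actual loader; it only illustrates the
    variables and fallbacks documented above.
    """
    return {
        "openai_api_key": os.environ.get("OPENAI_API_KEY"),            # required, no default
        "openai_assistant_id": os.environ.get("OPENAI_ASSISTANT_ID"),  # optional; new assistant created if unset
        "neo4j_uri": os.environ.get("NEO4J_URI", "bolt://localhost:7687"),
        "neo4j_username": os.environ.get("NEO4J_USERNAME", "neo4j"),
        "neo4j_password": os.environ.get("NEO4J_PASSWORD", "password"),
        "port": int(os.environ.get("PORT", "5000")),
    }
```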
### Customization

To modify the assistant's behavior, edit the `SYSTEM_MESSAGE` in `assistant_manager.py`.

OpenAI function schemas are defined in `assistant_functions.py`. You can add new functions by:

1. Adding a function schema to the `FUNCTION_DEFINITIONS` list
2. Implementing a handler function
3. Adding the handler to the `FUNCTION_DISPATCH` dictionary
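The three steps above can be sketched as follows. `FUNCTION_DEFINITIONS` and `FUNCTION_DISPATCH` live in `assistant_functions.py`; the schema and handler shown here (`get_problem_status`) are hypothetical examples, not part of the real codebase.

```python
import json

# Stand-ins for the structures in assistant_functions.py.
FUNCTION_DEFINITIONS = []
FUNCTION_DISPATCH = {}

# Step 1: add a function schema (OpenAI function-calling format).
FUNCTION_DEFINITIONS.append({
    "name": "get_problem_status",
    "description": "Look up the status of a problem by name.",
    "parameters": {
        "type": "object",
        "properties": {"name": {"type": "string"}},
        "required": ["name"],
    },
})

# Step 2: implement a handler function.
def get_problem_status(name: str) -> dict:
    return {"name": name, "status": "open"}  # placeholder logic

# Step 3: register the handler so tool calls can be dispatched.
FUNCTION_DISPATCH["get_problem_status"] = get_problem_status

def dispatch(call_name: str, arguments_json: str) -> dict:
    """Route a tool call (name + JSON arguments) to its registered handler."""
    handler = FUNCTION_DISPATCH[call_name]
    return handler(**json.loads(arguments_json))
```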
### Graph Relationships

The system uses Neo4j relationships to model connections between problems:

- `(Problem)-[:REQUIRES]->(Condition)`: A problem requires a condition to be met
- `(Problem)-[:MUST_BE_RESOLVED_BEFORE]->(Problem)`: Problem A must be resolved before Problem B can be solved
- `(Problem)-[:SOLVED_BY]->(Solution)`: A problem is solved by a solution
- `(Problem)-[:MAPPED_TO]->(CanonicalProblem)`: Maps problem variants to canonical form
- `(Condition)-[:MAPPED_TO]->(CanonicalCondition)`: Maps condition variants to canonical form

The `MUST_BE_RESOLVED_BEFORE` relationship models a clear sequential dependency between problems. For example, if Problem A must be resolved before Problem B, there is a direct relationship: `(A)-[:MUST_BE_RESOLVED_BEFORE]->(B)`.
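One natural use of `MUST_BE_RESOLVED_BEFORE` edges, once fetched from Neo4j, is computing a valid resolution order with a topological sort. This is a sketch, not code from the repository; the edge list here is hypothetical.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Each pair (a, b) mirrors (a)-[:MUST_BE_RESOLVED_BEFORE]->(b):
# a must be resolved before b.
edges = [("A", "B"), ("B", "C"), ("A", "C")]

# TopologicalSorter expects each node mapped to its set of predecessors.
deps: dict[str, set[str]] = {}
for before, after in edges:
    deps.setdefault(after, set()).add(before)
    deps.setdefault(before, set())

order = list(TopologicalSorter(deps).static_order())
print(order)  # ['A', 'B', 'C'] -- the only valid order for these edges
```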
### Future Development

See the `BRAINSTORMING.md` file for plans on future development, including:

- User interfaces
- Visualization tools
- Monitoring dashboard
- Authentication system