An AI-powered agent library for answering questions about Invopop and GOBL documentation using LangChain and MCP (Model Context Protocol) servers.
It also has access to the invopop/gobl code repository.

## Features

- Intelligent Q&A: Answers questions about Invopop and GOBL using advanced RAG
- Multi-source Search: Searches through documentation and code repositories
- Interactive CLI: Command-line interface for direct interaction
- Extensible: Easy to integrate into your own applications
- Context Aware: Maintains conversation history and context
## Prerequisites

- Python 3.13+
- Node.js 20+ (for MCP servers)
- OpenAI API key
Note: This project uses `uv` for dependency management. If you don't have `uv` installed, you can install it with `pip install uv`, or use `pipx` for an isolated installation.
## Installation

1. Install the MCP servers (required for documentation search):

   ```bash
   npx mint-mcp add invopop
   npx mint-mcp add gobl
   ```

2. Install the package:

   ```bash
   # Option 1: Development installation (recommended)
   git clone https://github.com/invopop/expert.git
   cd expert
   uv pip install -e .

   # Option 2: Using pipx (installs in an isolated environment)
   pipx install git+https://github.com/invopop/expert.git

   # Option 3: Direct installation with pip
   pip install git+https://github.com/invopop/expert.git
   ```

3. Set up environment variables:

   ```bash
   # Copy the example environment file and edit it with your values
   cp .env.example .env
   # Edit the .env file with your API keys
   # Then source it or restart your terminal
   source .env
   ```

4. Run the CLI:

   ```bash
   # If using the development installation (Option 1)
   uv run expert

   # If using the pipx installation (Option 2)
   expert

   # If using the pip installation (Option 3)
   expert

   # Alternative: run directly without activation (works for all options)
   python -m expert.main
   ```
## Usage

The CLI provides an interactive chat interface:

```
$ expert
Welcome to Invopop Expert! Ask questions about GOBL, Invopop and the invopop/gobl library
Enter your multi-line question. Press Enter on an empty line to send.
----------------------------------------------------------------------
You: How do I create an invoice with GOBL?
Thinking...
Searching GOBL docs: {"query": "create invoice GOBL"}
Assistant: To create an invoice with GOBL, you need to...
```
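The "press Enter on an empty line to send" behavior above can be sketched as a small helper that collects lines until the first blank one. This is a toy illustration of the input loop, not the actual CLI code; `read_multiline` is a hypothetical name:

```python
def read_multiline(lines):
    """Collect input lines until the first empty line, then join them."""
    buf = []
    for line in lines:
        if line.strip() == "":
            break
        buf.append(line)
    return "\n".join(buf)

# Simulated user input ending with an empty line.
question = read_multiline(["How do I create", "an invoice with GOBL?", ""])
```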
```bash
expert --help                  # Show help
expert --config config.yaml    # Use a custom config file
expert --verbose               # Enable verbose output
```
## Library Usage

You can also use Invopop Expert as a library in your own applications:
```python
import asyncio
from expert import InvopopExpert, Config

async def main():
    # Initialize the expert
    config = Config()
    expert = InvopopExpert(config)
    await expert.setup()

    # Ask a question
    thread_config = {"configurable": {"thread_id": "my-conversation"}}
    response = await expert.get_response(
        "How do I handle tax calculations in GOBL?",
        thread_config
    )
    print(response)

# Run the example
asyncio.run(main())
```
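The `thread_id` in `thread_config` scopes conversation memory: calls that share a thread id share history, while different ids stay isolated. A toy in-memory stand-in for that idea (not LangGraph's actual checkpointer, just an illustration of the keying):

```python
from collections import defaultdict

# Toy per-thread history store, keyed by thread_id.
history = defaultdict(list)

def remember(thread_config, message):
    """Append a message to the history of the thread named in thread_config."""
    thread_id = thread_config["configurable"]["thread_id"]
    history[thread_id].append(message)
    return history[thread_id]

a = {"configurable": {"thread_id": "my-conversation"}}
b = {"configurable": {"thread_id": "other"}}
remember(a, "How do I handle tax calculations in GOBL?")
remember(b, "Unrelated question")
remember(a, "And with multiple tax rates?")
```

Reusing the same `thread_id` across calls is what lets a follow-up question build on earlier answers.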
## Configuration

The agent uses a YAML configuration file (`config.yaml`):
```yaml
# LLM Configuration
llm:
  provider: "openai"
  model: "gpt-4.1-2025-04-14"
  temperature: 0.1

opik:
  project_name: "invopop-expert"

# MCP Server Configuration
mcp:
  servers:
    invopop:
      command: "node"
      args: ["~/.mcp/invopop/src/index.js"]
      transport: "stdio"
    gobl:
      command: "node"
      args: ["~/.mcp/gobl/src/index.js"]
      transport: "stdio"

# Chat Interface Configuration
chat:
  welcome_message: "Welcome to Invopop Expert!"
  input_prompt: "Enter your question:"
  max_history: 50
```
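One caveat about the `args` paths above: `~` is not expanded by the operating system when a subprocess is spawned directly, only by a shell. Whether the agent performs this expansion itself is an assumption here, but a launcher would typically do something like:

```python
import os

# Paths as they appear in config.yaml.
args = ["~/.mcp/invopop/src/index.js"]

# subprocess.Popen does no tilde expansion, so expand explicitly
# before handing the args to the MCP server process.
expanded = [os.path.expanduser(a) for a in args]
```

If MCP servers fail to start with a "file not found" error, an unexpanded `~` in the configured path is a plausible cause.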
You can configure the application using environment variables. Copy the example file and customize it:
```bash
cp .env.example .env
# Edit .env with your values, then source it:
source .env
```
Available Variables:

- `OPENAI_API_KEY` (required): Your OpenAI API key
- `OPIK_API_KEY` (optional): Your Opik API key for conversation tracing
- `OPIK_WORKSPACE` (optional): Your Opik workspace name for conversation tracing
- `INVOPOP_MCP_PATH` (optional): Custom path to the Invopop MCP server
- `GOBL_MCP_PATH` (optional): Custom path to the GOBL MCP server
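A sketch of how the required/optional split might be enforced when reading these variables. This is illustrative only; the library's actual `Config` implementation may differ, and `load_env_settings` is a hypothetical helper:

```python
import os

def load_env_settings(env):
    """Read API settings from a mapping such as os.environ."""
    if "OPENAI_API_KEY" not in env:
        # The OpenAI key is the only required variable.
        raise RuntimeError("OPENAI_API_KEY is required")
    return {
        "openai_api_key": env["OPENAI_API_KEY"],
        "opik_api_key": env.get("OPIK_API_KEY"),      # optional
        "opik_workspace": env.get("OPIK_WORKSPACE"),  # optional
    }

# In real use you would pass os.environ instead of a literal dict.
settings = load_env_settings({"OPENAI_API_KEY": "sk-test"})
```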
Invopop Expert supports optional conversation tracing with Opik for monitoring and analytics. You have two setup options:
Option 1: Environment Variables

```bash
export OPIK_API_KEY=your_opik_api_key
export OPIK_WORKSPACE=your_opik_workspace
```

Option 2: Interactive Configuration

```bash
opik configure
```
If neither option is configured, the agent will run normally but without tracing capabilities.
For example, you can embed the expert in a FastAPI application:

```python
from fastapi import FastAPI
from expert import InvopopExpert, Config

app = FastAPI()
expert = InvopopExpert(Config())

@app.on_event("startup")
async def startup():
    await expert.setup()

@app.post("/ask")
async def ask_question(question: str):
    thread_config = {"configurable": {"thread_id": "api-user"}}
    response = await expert.get_response(question, thread_config)
    return {"answer": response}
```
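In the snippet above every request uses the single `"api-user"` thread, so all API callers would share one conversation history. For per-client memory, you could derive the thread id from a client identifier instead; `thread_config_for` is a hypothetical helper, not part of the library:

```python
def thread_config_for(client_id: str) -> dict:
    """Build a per-client thread config so conversations don't mix."""
    return {"configurable": {"thread_id": f"api-{client_id}"}}

cfg = thread_config_for("alice")
```

In the FastAPI handler you might source `client_id` from an auth token or header, so each caller keeps an isolated conversation.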
## How It Works

- LangChain + LangGraph: AI agent framework with memory and tools
- MCP Protocol: Connects to Mintlify documentation servers
- Multi-source RAG: Searches Invopop docs, GOBL docs, and code repositories
- Conversation Memory: Maintains context across interactions
- Modular Design: Easy to extend with new tools and integrations
```
src/expert/
├── agent.py       # Core InvopopExpert agent implementation
├── config.py      # Configuration management
├── main.py        # CLI interface
├── __init__.py    # Package exports
└── prompts/       # System prompts and tool descriptions
    ├── system_prompt.md
    ├── invopop_docs_description.md
    ├── gobl_docs_description.md
    └── gobl_code_description.md
```
## Development

1. Clone the repository:

   ```bash
   git clone https://github.com/invopop/expert.git
   cd expert
   ```

2. Install MCP servers:

   ```bash
   npx mint-mcp add invopop
   npx mint-mcp add gobl
   ```

3. Install dependencies:

   ```bash
   # Install uv if not already installed
   pip install uv
   # Install the package in development mode
   uv pip install -e .
   ```

4. Set environment variables:

   ```bash
   # Copy and configure environment variables
   cp .env.example .env
   # Edit the .env file with your API keys, then source it
   source .env
   ```

5. Run the CLI:

   ```bash
   # Activate the virtual environment created by uv
   source .venv/bin/activate
   # Run the CLI
   expert
   # Or run directly without activating the venv
   python -m expert.main
   ```
```bash
# Run tests (when implemented)
pytest

# Run linting
ruff check src/
```
## Contributing

- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Make your changes
- Add tests for new functionality
- Ensure all tests pass and linting is clean
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
## Troubleshooting

Command 'expert' not found:

```bash
# If using the development installation, activate the virtual environment
source .venv/bin/activate
expert

# Or run directly as a Python module
python -m expert.main

# If using pipx, ensure the pipx bin directory is in your PATH
export PATH="$HOME/.local/bin:$PATH"
```
MCP servers not found:

```bash
npx mint-mcp add invopop
npx mint-mcp add gobl
```
OpenAI API errors:
- Verify your API key is correct and has sufficient credits
- Check that you're using a supported model
Import errors:

- Ensure you've installed the package: `uv pip install -e .` or `pip install -e .`
- Check that all dependencies are installed
- Try reinstalling: `uv pip install -e . --force-reinstall`
- Invopop Documentation
- GOBL Documentation
- Report Issues
## License

See the LICENSE file for details.