A modern, full-featured Flask-based chatbot application providing an intuitive web interface for interacting with Letta AI agents. Built with Flask, HTMX, and Tailwind CSS for a responsive, real-time chat experience.
- Multi-Agent Support - Create, manage, and chat with multiple AI agents simultaneously
- Real-Time Chat - Interactive chat interface powered by HTMX for seamless updates
- Persistent Memory - Agents maintain conversation history and context across sessions
- Archival Memory - Long-term semantic memory storage and retrieval
- Identity Management - Per-user conversation contexts with isolated memory blocks
- Agent Customization - Configure agent personas, memory blocks, and behavior
- Responsive Design - Mobile-first design that works beautifully on all devices
- Dark Mode Support - Built-in dark mode for comfortable viewing
- Session Management - Cookie-based user sessions for multi-user deployments
- Performance Optimized - Caching, rate limiting, and efficient API calls
- Security Hardened - XSS protection, CSRF tokens, input validation
- Production Ready - Comprehensive error handling and logging
- Quick Start
- Prerequisites
- Installation
- Configuration
- Usage
- Architecture
- API Reference
- Development
- Testing
- Deployment
- Contributing
- License
- Python 3.8+ - Modern Python with asyncio support
- Letta Server - Running instance of Letta server (local or cloud)
- Git - For cloning the repository
1. Clone the repository

   ```bash
   git clone https://github.com/actuallyrizzn/letta-web.git
   cd letta-web
   ```

2. Create virtual environment

   ```bash
   python -m venv venv

   # On Windows
   venv\Scripts\activate

   # On Linux/Mac
   source venv/bin/activate
   ```

3. Install dependencies

   ```bash
   pip install -r requirements.txt
   ```

4. Configure environment

   ```bash
   cp env.example .env
   ```

   Edit `.env` and set your configuration:

   ```bash
   LETTA_BASE_URL=https://your-letta-server.com:8283
   LETTA_API_KEY=your_api_key_here
   FLASK_SECRET_KEY=your_secret_key_here
   USE_COOKIE_BASED_AUTHENTICATION=true
   ```

5. Run the application

   ```bash
   python wsgi.py
   ```

6. Open in browser

   Navigate to http://localhost:5000
Create a .env file in the project root with the following variables:
| Variable | Description | Default | Required |
|---|---|---|---|
| `LETTA_BASE_URL` | URL of your Letta server | `http://localhost:8283` | Yes |
| `LETTA_API_KEY` | Your Letta API authentication key | `DEFAULT_TOKEN` | Yes |
| `FLASK_SECRET_KEY` | Secret key for Flask sessions | Random | Yes (production) |
| `USE_COOKIE_BASED_AUTHENTICATION` | Enable multi-user sessions | `true` | No |
| `CREATE_AGENTS_FROM_UI` | Allow agent creation from UI | `true` | No |
| `FLASK_ENV` | Environment mode | `development` | No |
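The precedence is simple: read each variable from the environment, fall back to the default in the table above. A minimal sketch of that resolution logic (illustrative only; the app's actual `config.py` may differ):

```python
import os

# Illustrative sketch of how the variables in the table above might be
# resolved; the project's real app/config.py may differ.
def load_config(env=os.environ):
    truthy = {"1", "true", "yes"}
    return {
        "LETTA_BASE_URL": env.get("LETTA_BASE_URL", "http://localhost:8283"),
        "LETTA_API_KEY": env.get("LETTA_API_KEY", "DEFAULT_TOKEN"),
        # Random fallback is fine for development; set it explicitly in production.
        "FLASK_SECRET_KEY": env.get("FLASK_SECRET_KEY", os.urandom(24).hex()),
        "USE_COOKIE_BASED_AUTHENTICATION":
            env.get("USE_COOKIE_BASED_AUTHENTICATION", "true").lower() in truthy,
        "CREATE_AGENTS_FROM_UI":
            env.get("CREATE_AGENTS_FROM_UI", "true").lower() in truthy,
        "FLASK_ENV": env.get("FLASK_ENV", "development"),
    }

cfg = load_config({"LETTA_BASE_URL": "https://letta.example.com:8283"})
print(cfg["LETTA_BASE_URL"])  # https://letta.example.com:8283
print(cfg["LETTA_API_KEY"])   # DEFAULT_TOKEN (fallback)
```

Note the boolean variables are strings in `.env`, so they need explicit truthiness parsing.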
Edit `default-agent.json` to customize default agent settings:
```json
{
  "DEFAULT_MEMORY_BLOCKS": [
    {
      "label": "human",
      "value": "The human's name is [User Name]"
    },
    {
      "label": "persona",
      "value": "My name is Sam, the all-knowing sentient AI."
    }
  ],
  "DEFAULT_LLM": "letta/letta-free",
  "DEFAULT_EMBEDDING": "letta/letta-free"
}
```

The application supports any model available in your Letta installation:
- Letta Cloud: `letta/letta-free`, `letta/letta-pro`
- OpenAI: `gpt-4`, `gpt-3.5-turbo`
- Anthropic: `claude-3-opus`, `claude-3-sonnet`
- Local Models: any model served via Ollama, LM Studio, etc.
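A quick way to catch mistakes in an edited `default-agent.json` is a structural sanity check. This validator is a hypothetical helper (not part of the project); the key names mirror the example above:

```python
import json

# Hypothetical sanity check for a default-agent.json payload; not part of
# the project itself. Key names mirror the documented example.
REQUIRED_KEYS = {"DEFAULT_MEMORY_BLOCKS", "DEFAULT_LLM", "DEFAULT_EMBEDDING"}

def validate_agent_config(text):
    cfg = json.loads(text)
    missing = REQUIRED_KEYS - cfg.keys()
    if missing:
        raise ValueError(f"missing keys: {sorted(missing)}")
    # Every memory block needs a label and a value.
    labels = []
    for block in cfg["DEFAULT_MEMORY_BLOCKS"]:
        if "label" not in block or "value" not in block:
            raise ValueError(f"malformed memory block: {block}")
        labels.append(block["label"])
    return cfg["DEFAULT_LLM"], labels

sample = """{
  "DEFAULT_MEMORY_BLOCKS": [
    {"label": "human", "value": "The human's name is [User Name]"},
    {"label": "persona", "value": "My name is Sam."}
  ],
  "DEFAULT_LLM": "letta/letta-free",
  "DEFAULT_EMBEDDING": "letta/letta-free"
}"""
model, labels = validate_agent_config(sample)
print(model, labels)  # letta/letta-free ['human', 'persona']
```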
- Click the "+" button in the sidebar
- Agent is created with default configuration
- Click on the agent to start chatting
- Hover over an agent in the sidebar
- Click the edit icon (pencil)
- Modify the agent's name, system prompt, or memory blocks
- Click "Save Changes"
- Select an agent from the sidebar
- Type your message in the input box
- Press Enter or click Send
- View the agent's response in real-time
Core Memory: Always-loaded context (persona, human info)
- Edit via the agent edit modal
- Limited size for efficiency
Archival Memory: Long-term semantic storage
- View via the "Archive" button
- Search and retrieve past information
- Automatically accessed by agent
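The archival search behavior described above can be illustrated with a toy in-memory store. Real Letta archival memory uses embedding-based semantic retrieval, so this keyword-overlap ranking is only an approximation of the idea:

```python
# Toy archival store: real Letta archival memory is embedding-based;
# this keyword-overlap ranking is only an illustration of retrieval.
def search_archive(passages, query, top_k=2):
    query_words = set(query.lower().split())
    # Score each passage by how many query words it shares.
    scored = [(len(query_words & set(p.lower().split())), p) for p in passages]
    scored = [item for item in scored if item[0] > 0]
    scored.sort(key=lambda item: (-item[0], item[1]))
    return [p for _, p in scored[:top_k]]

archive = [
    "The user prefers dark mode",
    "The user's favorite color is blue",
    "Meeting notes from Tuesday",
]
hits = search_archive(archive, "what color does the user prefer")
print(hits)
```

The agent performs this lookup automatically; the "Archive" button in the UI exposes the same store to the user.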
- Hover over an agent in the sidebar
- Click the options menu (three dots)
- Select "Delete"
- Confirm deletion
```
app/
├── __init__.py            # Flask application factory
├── config.py              # Configuration classes
├── routes/                # API endpoints
│   ├── agents.py          # Agent CRUD operations
│   ├── messages.py        # Message handling
│   ├── runtime.py         # Runtime configuration
│   └── frontend.py        # Page rendering
├── templates/             # Jinja2 templates
│   ├── base.html          # Base layout
│   ├── index.html         # Main chat page
│   └── components/        # Reusable components
├── utils/                 # Utility modules
│   ├── letta_client.py    # Letta API wrapper
│   ├── session_manager.py # User session handling
│   ├── validators.py      # Message filtering
│   ├── error_handler.py   # Error handling
│   └── performance.py     # Caching & rate limiting
└── static/                # Static assets
```
- HTMX: Handles real-time updates without full page reloads
- Tailwind CSS: Utility-first styling with dark mode support
- Alpine.js: Lightweight JavaScript for interactivity
- Server-Side Rendering: Fast initial load, SEO-friendly
- HTMX over React: Simpler architecture, less JavaScript
- Cookie-based sessions: Stateless authentication for multi-user
- Component-based templates: Reusable, maintainable UI
- Error boundaries: Graceful degradation on failures
- Rate limiting: Prevent abuse and API overload
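The rate-limiting decision above can be sketched as a sliding-window counter per user. This is an illustrative stand-in, not the project's actual `app/utils/performance.py`:

```python
import time
from collections import deque

# Illustrative sliding-window rate limiter; the project's actual
# implementation in app/utils/performance.py may differ.
class RateLimiter:
    def __init__(self, max_requests, window_seconds):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = {}  # key -> deque of request timestamps

    def allow(self, key, now=None):
        now = time.monotonic() if now is None else now
        timestamps = self.hits.setdefault(key, deque())
        # Drop timestamps that fell outside the window.
        while timestamps and now - timestamps[0] >= self.window:
            timestamps.popleft()
        if len(timestamps) >= self.max_requests:
            return False
        timestamps.append(now)
        return True

limiter = RateLimiter(max_requests=3, window_seconds=60)
results = [limiter.allow("user-1", now=t) for t in (0, 1, 2, 3)]
print(results)  # [True, True, True, False]
```

A denied request would typically map to an HTTP 429 response at the route layer.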
See docs/API.md for complete API documentation.
| Endpoint | Method | Description |
|---|---|---|
| `/` | GET | Main chat interface |
| `/<agent_id>` | GET | Chat with specific agent |
| `/api/agents` | GET | List user's agents |
| `/api/agents` | POST | Create new agent |
| `/api/agents/<id>` | GET | Get agent details |
| `/api/agents/<id>` | PUT | Update agent |
| `/api/agents/<id>` | DELETE | Delete agent |
| `/api/agents/<id>/messages` | GET | Get agent messages |
| `/api/agents/<id>/messages` | POST | Send message |
| `/api/runtime` | GET | Get configuration |
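A client-side call against the message endpoint above might look like the following standard-library sketch. The JSON payload shape (`{"message": ...}`) is an assumption; check docs/API.md for the actual request schema:

```python
import json
import urllib.request

BASE = "http://localhost:5000"

def build_send_message(agent_id, text):
    """Build (but do not send) a POST to /api/agents/<id>/messages.

    The payload shape is an assumption for illustration; see docs/API.md.
    """
    body = json.dumps({"message": text}).encode()
    return urllib.request.Request(
        f"{BASE}/api/agents/{agent_id}/messages",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_send_message("agent-123", "Hello!")
print(req.method, req.full_url)
# To actually send (requires a running server):
#     urllib.request.urlopen(req)
```

Remember that with cookie-based authentication enabled, a real client must also carry the session cookie.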
```bash
# Install development dependencies
pip install -r requirements.txt

# Run in development mode
export FLASK_ENV=development
python wsgi.py
```

- Blueprints: Organized into logical route groups
- Factory Pattern: Application created via `create_app()`
- Dependency Injection: Configuration passed to blueprints
- Error Handling: Centralized error handlers
- Logging: Comprehensive logging throughout
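The factory pattern listed above can be illustrated without Flask itself. This `create_app` is a simplified stand-in for the real one in `app/__init__.py`, which builds a Flask instance and registers the blueprints under `app/routes/`:

```python
# Simplified stand-in for the factory in app/__init__.py; the real
# create_app returns a Flask app with blueprints registered.
CONFIGS = {
    "development": {"DEBUG": True, "TESTING": False},
    "testing": {"DEBUG": True, "TESTING": True},
    "production": {"DEBUG": False, "TESTING": False},
}

class App:
    def __init__(self, config):
        self.config = dict(config)
        self.blueprints = []

    def register_blueprint(self, name):
        self.blueprints.append(name)

def create_app(env="development"):
    app = App(CONFIGS[env])
    # Mirrors the route groups under app/routes/.
    for bp in ("agents", "messages", "runtime", "frontend"):
        app.register_blueprint(bp)
    return app

app = create_app("testing")
print(app.config["TESTING"], app.blueprints)
```

The payoff of the factory is that tests can build an isolated app per test run (as the pytest fixture later in this README does) instead of sharing a module-level global.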
- Create route function in appropriate blueprint
- Add route decorator with path and methods
- Implement business logic
- Return JSON or rendered template
- Add error handling
Example:

```python
@agents_bp.route('/agents/<agent_id>/archive', methods=['GET'])
@handle_api_error
def get_agent_archive(agent_id):
    client = LettaClient()
    archive = client.get_archival_memory(agent_id)
    return jsonify(archive)
```

- Create template in `app/templates/components/`
- Include component in parent template
- Add HTMX attributes for interactivity
- Style with Tailwind classes
```bash
# Run all tests
python run_tests.py --mode all

# Run quick tests (unit only)
python run_tests.py --mode quick

# Run specific test suites
python run_tests.py --mode unit
python run_tests.py --mode integration
python run_tests.py --mode e2e

# Run tests in parallel
python run_tests.py --mode parallel

# Generate coverage report
python run_tests.py --mode all
# View htmlcov/index.html
```

- Unit Tests: Test individual functions and classes
- Integration Tests: Test component interactions
- E2E Tests: Test full user workflows
- Performance Tests: Test speed and efficiency
- Security Tests: Test for vulnerabilities
```python
import pytest
from app import create_app

@pytest.fixture
def client():
    app = create_app('testing')
    with app.test_client() as client:
        yield client

def test_create_agent(client):
    response = client.post('/api/agents', json={
        'name': 'Test Agent'
    })
    assert response.status_code == 200
```

See docs/DEPLOYMENT.md for detailed deployment instructions.
```bash
# Set production environment
export FLASK_ENV=production
export FLASK_SECRET_KEY="your-secure-secret-key"

# Run with Gunicorn
gunicorn -w 4 -b 0.0.0.0:5000 wsgi:app

# Or with uWSGI
uwsgi --http :5000 --wsgi-file wsgi.py --callable app
```

```bash
# Build image
docker build -t letta-chatbot .

# Run container
docker run -d -p 5000:5000 \
  -e LETTA_BASE_URL="https://your-server.com:8283" \
  -e LETTA_API_KEY="your_key" \
  letta-chatbot
```

```nginx
server {
    listen 80;
    server_name your-domain.com;

    location / {
        proxy_pass http://127.0.0.1:5000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

We welcome contributions! See CONTRIBUTING.md for guidelines.
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Make your changes
- Add tests for new functionality
- Run tests (`python run_tests.py --mode all`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
- Follow PEP 8 for Python code
- Use Black for code formatting
- Write docstrings for all functions
- Add type hints where appropriate
- Keep functions small and focused
This project is dual-licensed:
- Code (Python, JavaScript, etc.): GNU Affero General Public License v3.0
- Documentation (Markdown, text files, etc.): Creative Commons Attribution-ShareAlike 4.0
Copyright (C) 2025 Mark Hopkins
- Letta AI - For the amazing AI agent framework
- Flask - The web framework
- HTMX - For making interactive web apps simple
- Tailwind CSS - For beautiful styling
- All contributors and users of this project
- Documentation: docs/
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Letta Discord: Join the community
- WebSocket support for real-time streaming
- File upload and document chat
- Multi-modal support (images, audio)
- Agent templates and presets
- Conversation export/import
- Team collaboration features
- Advanced analytics dashboard
- Plugin system for extensions
Built with ❤️ by the Letta community
