FILEBOSS is an advanced digital evidence management system designed for legal, investigative, and forensic professionals. It provides a robust platform for collecting, processing, organizing, and analyzing digital evidence while maintaining a secure chain of custody and a comprehensive timeline of events.
- Evidence Management: Track and manage digital evidence with full audit trails and chain of custody
- Automated File Processing: Extract metadata, generate hashes, and process various file types
- Timeline Integration: Link evidence to timeline events for comprehensive case analysis
- Secure Access: JWT authentication with role-based access control
- Containerized Deployment: Easy deployment with Docker and Docker Compose
- CI/CD Pipeline: Automated testing and deployment with GitHub Actions
- RESTful API: Built with FastAPI for high performance and easy integration
- Database Migrations: Alembic for database versioning and schema management
Prerequisites:

- Docker 20.10+ and Docker Compose 2.0+
- Python 3.10+ (for development)
- PostgreSQL 14+ (included in Docker)
- Redis 7+ (included in Docker)
Quick start with Docker:

1. Clone the repository:

   ```bash
   git clone https://github.com/your-username/FILEBOSS.git
   cd FILEBOSS
   ```

2. Copy the example environment file and configure it (a sample `.env` is sketched just after these steps):

   ```bash
   cp .env.example .env
   # Edit .env with your configuration
   ```

3. Start the application with Docker Compose:

   ```bash
   docker-compose up -d
   ```

4. Initialize the database:

   ```bash
   docker-compose exec app python scripts/init_db.py
   ```
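The `.env` referenced in step 2 typically holds connection strings and secrets. A minimal sketch, assuming the variable names used by the production compose example later in this README (`SECRET_KEY` is a hypothetical placeholder for the JWT signing secret):

```env
# Connection strings; names match the docker-compose.prod.yml example below
DATABASE_URL=postgresql+asyncpg://fileboss:changeme@postgres:5432/fileboss
REDIS_URL=redis://redis:6379/0

# Hypothetical placeholder: secret used to sign JWT access tokens
SECRET_KEY=change-this-in-production

ENVIRONMENT=development
```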
For local development:

1. Create and activate a virtual environment:

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   ```

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Set up the database:

   ```bash
   # Start PostgreSQL and Redis
   docker-compose up -d postgres redis

   # Run migrations
   alembic upgrade head

   # Initialize database
   python scripts/init_db.py
   ```
4. Start the development server:

   ```bash
   uvicorn casebuilder.api.app:app --reload

   # Or using the start script
   python scripts/start.py
   ```
To run the full stack with Docker instead:

```bash
# Build and start all services
docker-compose up -d --build

# View logs
docker-compose logs -f
```
Once running, the application is available at:

- API Documentation:
  - Swagger UI: http://localhost:8000/api/docs
  - ReDoc: http://localhost:8000/api/redoc
- PGAdmin (if enabled): http://localhost:5050
  - Default credentials: admin@example.com / admin
For detailed API documentation, visit the interactive docs at:
- Swagger UI: http://localhost:8000/api/docs
- ReDoc: http://localhost:8000/api/redoc
Authentication:

- `POST /api/v1/auth/token` - Get access token (OAuth2 password flow)
- `POST /api/v1/auth/refresh` - Refresh access token
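A minimal sketch of obtaining a token with Python's `requests`, assuming the standard FastAPI OAuth2 password form (`username`/`password` form fields and an `access_token` response field); the credentials shown are hypothetical:

```python
import requests

BASE = "http://localhost:8000"

# OAuth2 password flow: credentials go in a form body, not JSON.
resp = requests.post(
    f"{BASE}/api/v1/auth/token",
    data={"username": "analyst@example.com", "password": "s3cret"},
)
resp.raise_for_status()
token = resp.json()["access_token"]

# Subsequent requests carry the token as a Bearer header.
headers = {"Authorization": f"Bearer {token}"}
```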
Evidence:

- `POST /api/v1/evidence/upload` - Upload a new evidence file
- `POST /api/v1/evidence/process-directory` - Process a directory of evidence files
- `GET /api/v1/evidence/{evidence_id}` - Get evidence details
- `PUT /api/v1/evidence/{evidence_id}/status` - Update evidence status
- `POST /api/v1/evidence/organize` - Organize evidence files
- `POST /api/v1/evidence/{evidence_id}/link-timeline` - Link evidence to timeline event
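Uploading a file might look like the following sketch (it continues from the authentication example above; the multipart field name `file` is an assumption, so check the Swagger UI for the actual schema):

```python
# Continues from the auth sketch above (reuses BASE and headers).
# The multipart field name "file" is an assumption.
with open("disk_image.dd", "rb") as fh:
    resp = requests.post(
        f"{BASE}/api/v1/evidence/upload",
        headers=headers,
        files={"file": fh},
    )
resp.raise_for_status()
evidence = resp.json()  # expected to include the new evidence record
```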
Cases:

- `GET /api/v1/cases` - List all cases
- `POST /api/v1/cases` - Create a new case
- `GET /api/v1/cases/{case_id}` - Get case details
- `GET /api/v1/cases/{case_id}/evidence` - Get evidence for a case
- `GET /api/v1/cases/{case_id}/timeline` - Get timeline for a case
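Creating a case and fetching its evidence, as a sketch (the `title` request field and `id` response field are hypothetical; consult the interactive docs for the real schema):

```python
# Continues from the sketches above (reuses BASE and headers).
resp = requests.post(
    f"{BASE}/api/v1/cases",
    headers=headers,
    json={"title": "Case 2024-001"},  # hypothetical body
)
resp.raise_for_status()
case_id = resp.json()["id"]  # assumed response field

# List all evidence attached to the case.
case_evidence = requests.get(
    f"{BASE}/api/v1/cases/{case_id}/evidence",
    headers=headers,
).json()
```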
Timeline:

- `GET /api/v1/timeline` - List timeline events
- `POST /api/v1/timeline` - Create a new timeline event
- `POST /api/v1/timeline/{event_id}/link-evidence` - Link evidence to a timeline event
- `GET /api/v1/timeline/search` - Search timeline events
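And a sketch of creating an event and linking evidence to it (all request and response fields here are assumptions):

```python
# Continues from the sketches above (reuses BASE, headers, evidence).
resp = requests.post(
    f"{BASE}/api/v1/timeline",
    headers=headers,
    json={  # hypothetical body
        "description": "Laptop seized from suspect's office",
        "timestamp": "2024-05-01T10:30:00Z",
    },
)
event_id = resp.json()["id"]  # assumed response field

# Link the previously uploaded evidence to the event.
requests.post(
    f"{BASE}/api/v1/timeline/{event_id}/link-evidence",
    headers=headers,
    json={"evidence_id": evidence["id"]},  # assumed body/field names
)
```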
Development prerequisites:

- Python 3.10+
- Poetry (for dependency management)
- Docker and Docker Compose

1. Install Poetry (if not installed):

   ```bash
   curl -sSL https://install.python-poetry.org | python3 -
   ```

2. Install dependencies:

   ```bash
   poetry install
   ```

3. Set up pre-commit hooks:

   ```bash
   pre-commit install
   ```
This project uses several tools to maintain code quality:
```bash
# Format code with Black and isort
poetry run black .
poetry run isort .

# Type checking with mypy
poetry run mypy .

# Linting with pylint
poetry run pylint casebuilder tests

# Run all checks
poetry run pre-commit run --all-files
```
Run the test suite:
```bash
# Run all tests
pytest

# Run tests with coverage
pytest --cov=casebuilder --cov-report=html

# Run a specific test file
pytest tests/test_evidence_processing.py -v
```
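New tests follow the usual pytest layout. A minimal smoke-test sketch using FastAPI's `TestClient`, assuming the `app` object importable from `casebuilder.api.app` (the module the uvicorn command above uses):

```python
# tests/test_smoke.py - a minimal sketch, not part of the existing suite.
from fastapi.testclient import TestClient

from casebuilder.api.app import app

client = TestClient(app)


def test_swagger_ui_is_served():
    # The interactive docs are mounted at /api/docs per this README.
    response = client.get("/api/docs")
    assert response.status_code == 200
```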
Database migrations:

1. Create a new migration:

   ```bash
   docker-compose exec app python scripts/create_migration.py -m "Your migration message"
   ```

2. Apply migrations:

   ```bash
   docker-compose exec app alembic upgrade head
   ```
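A generated migration is a plain Alembic script with `upgrade()` and `downgrade()` functions. A minimal sketch of what one might look like (the revision IDs, table, and column are hypothetical):

```python
"""Add notes column to evidence"""
from alembic import op
import sqlalchemy as sa

# Alembic generates these identifiers; hypothetical values shown.
revision = "abc123"
down_revision = "def456"


def upgrade() -> None:
    # Hypothetical example: add a nullable notes column to evidence.
    op.add_column("evidence", sa.Column("notes", sa.Text(), nullable=True))


def downgrade() -> None:
    op.drop_column("evidence", "notes")
```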
Deploying with Docker:

```bash
# Start all services
docker-compose up -d

# View logs
docker-compose logs -f

# Run database migrations
docker-compose exec app alembic upgrade head

# Initialize database
docker-compose exec app python scripts/init_db.py
```
For production deployments, we recommend using:
- Docker Swarm or Kubernetes for container orchestration
- Traefik or Nginx as reverse proxy with Let's Encrypt
- PostgreSQL with replication and backups
- Redis for caching and background tasks
- Prometheus and Grafana for monitoring
Example production `docker-compose.prod.yml`:

```yaml
version: "3.8"

services:
  app:
    image: your-registry/fileboss:latest
    deploy:
      replicas: 3
      update_config:
        parallelism: 1
        delay: 10s
      restart_policy:
        condition: on-failure
    environment:
      - ENVIRONMENT=production
      - DATABASE_URL=postgresql+asyncpg://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
      - REDIS_URL=redis://redis:6379/0
    depends_on:
      - postgres
      - redis
    networks:
      - fileboss-network

  # ... other services (postgres, redis, etc.)

networks:
  fileboss-network:
    driver: overlay
```
The project includes a GitHub Actions workflow (`.github/workflows/ci-cd.yml`) that:

- Runs tests on every push and pull request
- Builds and pushes Docker images to Docker Hub
- Deploys to staging when pushing to `develop`
- Deploys to production when pushing to `main`

Required secrets:

- `DOCKERHUB_USERNAME` - Docker Hub username
- `DOCKERHUB_TOKEN` - Docker Hub access token
- `SSH_PRIVATE_KEY` - SSH key for deployment
- `STAGING_*` - Staging environment variables
- `PRODUCTION_*` - Production environment variables
The optional Codex Resonator plugin helps you generate "resonator scrolls" that summarize a repository and suggest future improvements.
1. (Optional) Install the OpenAI client if you want AI-generated insights:

   ```bash
   pip install openai
   ```

2. Run the plugin:

   ```bash
   python -m plugins.codex_resonator <path-to-repo> --output-dir ./scrolls --openai
   ```

   Omit `--openai` to generate a scroll from README content only. When using the `--openai` flag, set your API key via the `OPENAI_API_KEY` environment variable.

The generated scroll will appear in the chosen output directory.
This project is licensed under the MIT License - see the LICENSE file for details.
We welcome contributions! Please follow these steps:
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'feat: add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
We follow Conventional Commits:

```
<type>[optional scope]: <description>

[optional body]

[optional footer(s)]
```

Example:

```
feat(api): add user authentication endpoint

- Add POST /auth/login endpoint
- Add JWT token generation
- Add API documentation

Closes #123
```
For support, please open an issue in the GitHub issue tracker.