Blackrose

Caution

Blackrose is still in development. Expect bugs and broken or unfinished features.

🌟 Overview

Blackrose is a backend for AI-powered applications.

🚀 Installation and Configuration

Prerequisites

All of the following are also available via nix-shell.

  • Python 3.12
  • pip
  • Poetry
  • git

Installation for Development

  1. Clone the repository

     git clone https://github.com/Dino-Kupinic/ai-backend.git

  2. [Optional] Create a Python virtual environment for a local installation

     python3 -m venv venv

     Activate the venv (example for *nix systems):

     source ./venv/bin/activate

  3. Install dependencies

     poetry install
  4. Create a .env file in the root directory and copy over the fields from the .env.example file.

  5. Download Ollama for your system from https://ollama.com.

Note

This step can be skipped if you use nix-shell.

Note

In the future, Ollama will be downloaded from the command line automatically.

  6. Start Ollama and pull the model

     ollama serve
     ollama pull llama3

  7. Run the development server

     fastapi dev src/main.py

📖 Documentation

OpenAPI Documentation

The OpenAPI documentation is available at /docs. It is automatically generated from the code.

Configuration

// WIP

Usage

curl -X POST "http://localhost:8000/message/" -H "Content-Type: application/json" -d '{"prompt": "Tell me something about Vienna, Austria", "model": "llama3"}' --no-buffer

Tip

--no-buffer is needed because the endpoint streams its response.
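The same request can also be issued from Python using only the standard library. A minimal sketch — the endpoint path, payload fields, and default model are taken from the curl example above; the base URL assumes the default `fastapi dev` address, and everything else is illustrative:

```python
import json
from urllib import request

BASE_URL = "http://localhost:8000"  # assumed default address of `fastapi dev`


def build_message_request(prompt: str, model: str = "llama3") -> request.Request:
    """Build a POST request mirroring the curl call above."""
    payload = json.dumps({"prompt": prompt, "model": model}).encode("utf-8")
    return request.Request(
        f"{BASE_URL}/message/",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def stream_message(prompt: str, model: str = "llama3"):
    """Send the request and yield decoded chunks as the server streams them.

    Requires the Blackrose server and Ollama to be running.
    """
    with request.urlopen(build_message_request(prompt, model)) as resp:
        for chunk in resp:
            yield chunk.decode("utf-8")
```

With both services running, `for part in stream_message("Tell me something about Vienna, Austria"): print(part, end="")` plays the role of the curl command, printing chunks as they arrive rather than waiting for the full response.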

// WIP

🧪 Testing

To run the test suite:

  1. Ensure that both the AI Backend and Ollama services are running.
  2. Execute the following command:
pytest

This will run all tests in the tests/ directory.

📝 Contributing

// WIP

📚 Resources

📄 License

This project is licensed under the MIT License; see the LICENSE file for details.


For more information, please open an issue or contact the maintainers.
