Description
Problem
The application appears to rely on hardcoded endpoints for its LLM and diffusion model sources, making it inflexible with respect to backend model providers.
Issue
Users cannot reconfigure the application to point at alternative LLM or diffusion backends (e.g., local models or self-hosted APIs). This restricts the app to a single hosted provider, such as OpenAI.
Proposed Solution
Introduce a configurable setting inside the app that lets users specify a custom API endpoint compatible with the OpenAI API. The application could then query the /models endpoint (or equivalent) to dynamically fetch and list the available models. This would enable support for local models and other third-party APIs without code modification.
Image generation via alternative diffusion backends (perhaps a ComfyUI instance) is less important, but would be useful too.