Feature Request: Consume models via API in the GUI #5726

@qdrddr

Description

Is your feature request related to a problem? Please describe.
As a UI application user, I find it cumbersome to switch between different platforms to use both local and proprietary models. This fragmentation makes it difficult to efficiently compare and test various models.

Describe the solution you'd like
I would like the ability to use the chat in the UI with both local and proprietary models available via API, such as OpenAI, OpenRouter, etc. This integration would allow seamless switching between locally running LLM models and SaaS models within the same interface.
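The switching described here can be sketched against OpenAI-compatible endpoints (LocalAI exposes one under `/v1`, as do OpenAI and OpenRouter). A minimal sketch, assuming illustrative base URLs and placeholder API keys — the backend names and configuration shape are hypothetical, not an actual LocalAI config:

```python
import json

# Hypothetical backend registry: local and SaaS providers behind the
# same OpenAI-style chat API. URLs/keys are illustrative assumptions.
BACKENDS = {
    "local": {"base_url": "http://localhost:8080/v1", "api_key": "not-needed"},
    "openai": {"base_url": "https://api.openai.com/v1", "api_key": "sk-..."},
    "openrouter": {"base_url": "https://openrouter.ai/api/v1", "api_key": "sk-or-..."},
}

def chat_request(backend: str, model: str, prompt: str) -> dict:
    """Build an OpenAI-style /chat/completions request for the chosen backend.

    The same payload shape works for every backend; only the base URL
    and credentials change, which is what makes seamless switching possible.
    """
    cfg = BACKENDS[backend]
    return {
        "url": cfg["base_url"] + "/chat/completions",
        "headers": {
            "Authorization": f"Bearer {cfg['api_key']}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }
```

Because the request shape is identical, a UI could switch between `chat_request("local", ...)` and `chat_request("openrouter", ...)` with a single dropdown.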

Describe alternatives you've considered
One alternative I've considered is using separate applications for local and proprietary models, but this approach is inefficient and time-consuming. Another alternative is a third-party app that already has a UI and can consume proprietary SaaS models and LocalAI via API, but this requires additional technical expertise and maintenance.

Additional context
Having a unified interface in the LocalAI UI for both local and proprietary models would significantly enhance productivity and ease of use. It would streamline the workflow for users who need to compare and test different models regularly.

Metadata

    Labels

    enhancement (New feature or request)
