[IDEA]: Self-Hosted AI Chat Responses (Ollama) #1082

@cotton105

Description

In the spirit of self-hosted tools, it would be great if the /chat command could support offline AI text generation, for example with a local Ollama instance.

At the moment, it seems restricted to online services that require an API token (OpenAI, Gemini, Anthropic). If it were possible to query a locally hosted alternative, it would help keep everything self-contained and give the Bot Hoster the freedom to choose whichever model they want.

I've done a small amount of research and found an Ollama API for Node, which might help an implementation on Bastion.
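As a rough illustration, here is a minimal sketch of querying a local Ollama instance from Node over its REST API. The `/api/chat` endpoint path and payload shape follow Ollama's documented HTTP API; the host, model name, and the `buildChatRequest` helper are illustrative assumptions, not anything from Bastion's codebase.

```javascript
// Assumes Node 18+, which ships a global fetch.
const OLLAMA_HOST = 'http://localhost:11434'; // Ollama's default port

// Build the JSON body for a non-streaming /api/chat request.
function buildChatRequest(model, prompt) {
  return {
    model,
    messages: [{ role: 'user', content: prompt }],
    stream: false, // return one complete response instead of chunks
  };
}

// Send the prompt to the local Ollama instance and return the reply text.
async function chat(model, prompt) {
  const response = await fetch(`${OLLAMA_HOST}/api/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildChatRequest(model, prompt)),
  });
  if (!response.ok) throw new Error(`Ollama returned ${response.status}`);
  const data = await response.json();
  return data.message.content;
}
```

The official `ollama` npm package wraps these same endpoints (e.g. `ollama.chat({ model, messages })`), so either approach could presumably slot in behind the existing /chat command.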

Metadata

Assignees

No one assigned

    Labels

    💡 Idea: Ideas, suggestions or feature requests
