Feature request: support for SGLang and Ollama #3

Open
Description

@rachittshah

Hi folks!

Since the current project only supports vLLM, I'd love to propose support for SGLang and Ollama.

Thinking behind the proposal:

Many teams prefer SGLang over vLLM and vice versa; supporting both would broaden adoption.

Ollama is one of the quickest ways to get started with on-prem LLMs, which reduces friction and improves the DX.

Happy to take a shot at a PR for this!
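
To make the idea concrete, here's a minimal sketch of how the backends could be wired up behind one interface. All class and function names are hypothetical (not this project's actual API), and it assumes each server exposes an OpenAI-compatible `/v1` endpoint, which vLLM, SGLang, and Ollama all do by default:

```python
# Hypothetical sketch only: names and structure are illustrative, not the
# project's real API. Assumes each backend is already running locally and
# serves an OpenAI-compatible /v1 endpoint.
from dataclasses import dataclass
from openai import OpenAI  # pip install openai


@dataclass
class Backend:
    name: str
    base_url: str
    api_key: str = "not-needed"  # local servers typically ignore the key

    def client(self) -> OpenAI:
        return OpenAI(base_url=self.base_url, api_key=self.api_key)


# Default ports (assumptions; all three servers let you change them).
BACKENDS = {
    "vllm": Backend("vllm", "http://localhost:8000/v1"),
    "sglang": Backend("sglang", "http://localhost:30000/v1"),
    "ollama": Backend("ollama", "http://localhost:11434/v1"),
}


def generate(backend_name: str, model: str, prompt: str) -> str:
    """Route a single prompt to the chosen backend."""
    client = BACKENDS[backend_name].client()
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content


# e.g. generate("ollama", "llama3.1", "Hello!")
```

Since all three speak the OpenAI protocol, most of the work would be config/docs plus handling backend-specific quirks (model naming, startup flags, health checks) rather than separate client code paths.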
