Open
Description
Hi,
I see you support vLLM and are planning to support Ollama and Cerebras. Wouldn't it make more sense to support OpenAI-compatible backends instead? That would cover all of the above plus LM Studio, as well as remote providers like OpenAI, together.ai, OpenRouter, etc. Or is there an issue with that approach?
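To illustrate what I mean: most of these backends expose the same chat completions wire format, so a single client with a configurable base URL could talk to any of them. A minimal sketch, assuming only a base URL and API key need to vary (the endpoint path follows the OpenAI REST API; the local ports shown are the common defaults for vLLM and Ollama, but are illustrative):

```python
import json

def build_chat_request(base_url: str, model: str, prompt: str, api_key: str = "none"):
    """Return (url, headers, body) for a POST to an OpenAI-compatible backend."""
    url = f"{base_url.rstrip('/')}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

# The same code targets any backend; only base_url (and the key) changes:
backends = {
    "vllm": "http://localhost:8000/v1",     # vLLM's OpenAI-compatible server
    "ollama": "http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    "openai": "https://api.openai.com/v1",
}
url, headers, body = build_chat_request(backends["vllm"], "my-model", "Hello")
```

So rather than one integration per backend, the project would need just one code path plus per-backend configuration.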
Looking forward to using this project; it seems really promising!
Best,
Florian