Replies: 1 comment
I think I found the answer here: https://docs.litellm.ai/docs/providers/openai#using-openai-proxy-with-litellm
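For reference, here is a minimal sketch of what that doc suggests, applied to this setup through dspy. The model id, port, and key below are assumptions for a local jan.ai server, not taken from the original post:

```python
import dspy

# The "openai/" prefix tells litellm to treat the endpoint as an
# OpenAI-compatible server instead of trying to guess the provider.
lm = dspy.LM(
    "openai/mistral-ins-7b-q4",           # hypothetical model id served by jan.ai
    api_base="http://localhost:1337/v1",  # jan.ai's local endpoint (assumed)
    api_key="dummy",                      # local servers typically ignore the key
)
dspy.configure(lm=lm)
```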
So I am using jan.ai to manage my list of LLMs, and I use its built-in API server to host the LLM service. This is how I connect to it in Python with the OpenAI library:
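(The connection looks roughly like the sketch below; the port and model name depend on the jan.ai setup and are assumptions here.)

```python
from openai import OpenAI

# Point the client at jan.ai's local OpenAI-compatible server (URL assumed).
client = OpenAI(
    base_url="http://localhost:1337/v1",
    api_key="dummy",  # the local server does not check the key
)

response = client.chat.completions.create(
    model="mistral-ins-7b-q4",  # hypothetical model id loaded in jan.ai
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```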
I don't use litellm directly, but through dspy. However, when dspy tries to get a completion through litellm, it does something like this under the hood:
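(A sketch of the underlying call, not dspy's exact code; the model name and URL are assumptions carried over from above.)

```python
import litellm

# With a bare model name, litellm cannot infer which provider to route
# the request to, and raises BadRequestError ("LLM Provider NOT provided").
response = litellm.completion(
    model="mistral-ins-7b-q4",  # no "openai/" prefix -> litellm can't route it
    messages=[{"role": "user", "content": "Hello!"}],
    api_base="http://localhost:1337/v1",
)
```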
and it insists that I provide an LLM provider, failing with a `BadRequestError`. Is there a way to bypass the provider part?