Does the LlamaIndex library support LiteLLM's Router class, or is there another way to load-balance LLM API calls?
#15781 · abhishekrp-ai2002 started this conversation in General · Replies: 0 comments
I noticed that the LlamaIndex docs for LiteLLM only use litellm's LLM completion interface, not its Router.
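If the integration only exposes the completion interface, one workaround is to wrap a router of your own around several deployments. The sketch below shows the round-robin pattern that `litellm.Router` implements, using plain stdlib Python; the endpoint callables are hypothetical stand-ins for real API clients, not LiteLLM or LlamaIndex objects:

```python
# Minimal sketch of round-robin load balancing across several LLM
# deployments, the core pattern behind litellm's Router. The endpoint
# callables are hypothetical stand-ins, not real API clients.
import itertools
import threading
from typing import Callable, List


class RoundRobinRouter:
    """Cycle completion calls across a pool of interchangeable endpoints."""

    def __init__(self, endpoints: List[Callable[[str], str]]) -> None:
        self._cycle = itertools.cycle(endpoints)
        self._lock = threading.Lock()  # keep rotation safe under concurrency

    def complete(self, prompt: str) -> str:
        with self._lock:
            endpoint = next(self._cycle)
        return endpoint(prompt)


# Two fake deployments so the routing behavior is visible.
deployment_a = lambda prompt: f"A:{prompt}"
deployment_b = lambda prompt: f"B:{prompt}"

router = RoundRobinRouter([deployment_a, deployment_b])
results = [router.complete("hi") for _ in range(4)]
print(results)  # alternates between the two deployments
```

A real setup would also want retries and fallback on failure, which `litellm.Router` handles; the point here is only that the routing layer can sit above whatever completion interface LlamaIndex already wraps.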