what is the difference between litellm and vllm? #4391
YunchaoYang announced in General
Replies: 2 comments
-
Hey @YunchaoYang yes! You can call vLLM through the LiteLLM proxy - https://docs.litellm.ai/docs/providers/openai_compatible - since vLLM exposes an OpenAI-compatible API.
LiteLLM Proxy config:
model_list:
  - model_name: my-model
    litellm_params:
      model: openai/<your-model-name>  # add openai/ prefix to route as an OpenAI-compatible provider
      api_base: <model-api-base>       # api base of your OpenAI-compatible server
      api_key: api-key                 # api key to send to your model server
Start the proxy:
litellm --config /path/to/config.yaml

Test it:
curl --location 'http://0.0.0.0:4000/chat/completions' \
--header 'Authorization: Bearer sk-1234' \
--header 'Content-Type: application/json' \
--data '{
    "model": "my-model",
    "messages": [
        {
            "role": "user",
            "content": "what llm are you"
        }
    ]
}'
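For reference, the same request can be made from Python with the OpenAI SDK pointed at the proxy. This is a minimal sketch assuming the proxy is running locally on port 4000 with the sk-1234 key from the curl example above:

from openai import OpenAI

# Point the standard OpenAI client at the LiteLLM proxy instead of api.openai.com.
# base_url and api_key match the curl example above (assumed local setup).
client = OpenAI(base_url="http://0.0.0.0:4000", api_key="sk-1234")

response = client.chat.completions.create(
    model="my-model",  # the model_name defined in the proxy config
    messages=[{"role": "user", "content": "what llm are you"}],
)
print(response.choices[0].message.content)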
-
LiteLLM is just a proxy that handles calling different LLM APIs easily.
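To illustrate that point, here is a minimal sketch using LiteLLM's Python SDK: the same completion() call can target a hosted provider or a self-hosted vLLM server through its OpenAI-compatible endpoint (the model names and api_base below are placeholders, not values from this thread):

import litellm

# Call a hosted provider (placeholder model name).
hosted = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hello"}],
)

# Call a self-hosted vLLM server via its OpenAI-compatible API
# (placeholder api_base; the openai/ prefix routes it as an OpenAI-compatible provider).
local = litellm.completion(
    model="openai/<your-model-name>",
    api_base="<model-api-base>",
    api_key="api-key",
    messages=[{"role": "user", "content": "hello"}],
)
print(local.choices[0].message.content)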
-
Wondering what the major difference is between vLLM and LiteLLM? Is there a way for them to work together?