Add llamafile as a provider #10203
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
base: main
Conversation
Force-pushed from de6072a to 6466488
Force-pushed from 6466488 to 11eceb3
this is missing a test to confirm it would
- work with `.completion(model="llamafile/<my-custom-model>", ..)`
- work with `get_optional_params()` -> used by completion for translating params
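The reviewer's first point concerns custom model paths that contain extra slashes. As a minimal, self-contained sketch (illustrative only, not LiteLLM's actual routing code), handling such a string means splitting off the provider prefix on the first slash only:

```python
def split_provider(model: str) -> tuple[str, str]:
    """Split a "provider/model" string on the first slash only, so nested
    model paths keep the remaining slashes in the model name."""
    provider, _, model_name = model.partition("/")
    return provider, model_name

# The custom-model case from the review comment:
print(split_provider("llamafile/mistralai/Mistral-7B-Instruct-v0.2"))
# -> ('llamafile', 'mistralai/Mistral-7B-Instruct-v0.2')
```

A test along these lines would cover the `llamafile/<my-custom-model>` case the reviewer asks about.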
Thanks @krrishdholakia, would `test_completion_with_custom_llamafile_model` satisfy this?
Force-pushed from 11eceb3 to e0dacfe
…ude Llamafile in the sidebar
Force-pushed from 8feba56 to b63eb3d
Force-pushed from b63eb3d to 57a5ff9
Force-pushed from 57a5ff9 to 293d751
Title

Implement llamafile as a supported provider.

Relevant issues

Closes: #3225
Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR:
- I have added testing in the `tests/litellm/` directory (adding at least 1 test is a hard requirement, see details)
- My PR passes all unit tests on `make test-unit` (https://docs.litellm.ai/docs/extras/contributing_code)

Type
🆕 New Feature
📖 Documentation
✅ Test
Changes

- Adds llamafile as a provider, including via `model` (e.g. `model="llamafile/mistralai/Mistral-7B-Instruct-v0.2"`)
- For llamafile, the API key defaults to `fake-api-key` if it isn't configured by the user or available in the secret manager (under `LLAMAFILE_API_KEY`)
- For llamafile, the base API URL will fall back to `http://127.0.0.1:8080/v1` if it isn't configured by the user or available in the secret manager (under `LLAMAFILE_API_BASE`)

Vercel URLs:
Receipts
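The key and base-URL fallbacks described under Changes can be sketched as follows. This is an illustrative snippet assuming only the environment-variable names and defaults stated in the PR description, not the PR's actual implementation (which also consults the secret manager):

```python
import os

def resolve_llamafile_settings() -> tuple[str, str]:
    # Fall back to the defaults named in the PR description when the
    # environment variables are unset; secret-manager lookup is omitted here.
    api_key = os.environ.get("LLAMAFILE_API_KEY") or "fake-api-key"
    api_base = os.environ.get("LLAMAFILE_API_BASE") or "http://127.0.0.1:8080/v1"
    return api_key, api_base
```

With neither variable set, this returns `("fake-api-key", "http://127.0.0.1:8080/v1")`, matching the llamafile server's default OpenAI-compatible endpoint.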