Suspected logic bug in 'get_optional_params' #10245
Bug: get_optional_params
Draft PR for a potential logic bug in `get_optional_params` (where the condition `custom_llm_provider == "ollama"` means it will only ever be `ollama`).

There was also a case where `function_call` was the only thing present in `non_default_params`, but the code didn't check/try to get its value to set it as `optional_params["functions_unsupported_model"]`.

The PR adjusts this behaviour so that it is the same regardless of whether `custom_llm_provider` is `ollama` or not in the list of `openai_compatible_providers`.
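To make the suspected issue concrete, here is a minimal hypothetical sketch of the pattern described above (simplified names and structure, not litellm's actual source):

```python
def handle_unsupported_functions(
    custom_llm_provider: str, non_default_params: dict
) -> dict:
    """Illustrative sketch only - not the real get_optional_params."""
    optional_params: dict = {}
    # Gated on equality, so this branch can only ever fire for "ollama",
    # even if the same handling is wanted for other providers as well.
    if custom_llm_provider == "ollama":
        # Only `tools` and `functions` are checked here, so a request where
        # `function_call` is the only key in non_default_params never sets
        # optional_params["functions_unsupported_model"].
        if "tools" in non_default_params:
            optional_params["functions_unsupported_model"] = non_default_params.pop("tools")
        elif "functions" in non_default_params:
            optional_params["functions_unsupported_model"] = non_default_params.pop("functions")
    return optional_params
```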
I had to make a judgement call (so happy to get feedback here): since the `if`/`elif` statement for `ollama` checked `tools` then `functions`, `function_call` would come after those two. So the order of precedence is: `tools`, then `functions`, then `function_call`.
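And a hypothetical sketch of the adjusted behaviour with that precedence (again illustrative, not the actual diff):

```python
def pop_functions_unsupported_model(
    non_default_params: dict, optional_params: dict
) -> None:
    """Illustrative sketch only - not the real patch."""
    # Precedence chosen in this PR: tools, then functions, then
    # function_call; the first key present wins.
    for key in ("tools", "functions", "function_call"):
        if key in non_default_params:
            optional_params["functions_unsupported_model"] = non_default_params.pop(key)
            return

# Example: a request carrying only `function_call` now populates the param.
optional_params: dict = {}
pop_functions_unsupported_model({"function_call": {"name": "get_weather"}}, optional_params)
assert "functions_unsupported_model" in optional_params
```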
Relevant issues

Refs: #10244
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
- [ ] I have added testing in the `tests/litellm/` directory, adding at least 1 test is a hard requirement - see details
- [ ] My PR passes all unit tests on (`make test-unit`) [https://docs.litellm.ai/docs/extras/contributing_code]

Type
🐛 Bug Fix
🧹 Refactoring
Changes