Revert "Support max_completion_tokens on Mistral" #9604


Merged: 1 commit, Mar 28, 2025
9 changes: 1 addition & 8 deletions litellm/llms/mistral/mistral_chat_transformation.py

@@ -28,9 +28,7 @@ class MistralConfig(OpenAIGPTConfig):
 
     - `top_p` (number or null): An alternative to sampling with temperature, used for nucleus sampling. API Default - 1.
 
-    - `max_tokens` [DEPRECATED - use max_completion_tokens] (integer or null): This optional parameter helps to set the maximum number of tokens to generate in the chat completion. API Default - null.
-
-    - `max_completion_tokens` (integer or null): This optional parameter helps to set the maximum number of tokens to generate in the chat completion. API Default - null.
+    - `max_tokens` (integer or null): This optional parameter helps to set the maximum number of tokens to generate in the chat completion. API Default - null.
 
     - `tools` (list or null): A list of available tools for the model. Use this to specify functions for which the model can generate JSON inputs.
 
@@ -48,7 +46,6 @@ class MistralConfig(OpenAIGPTConfig):
     temperature: Optional[int] = None
     top_p: Optional[int] = None
     max_tokens: Optional[int] = None
-    max_completion_tokens: Optional[int] = None
     tools: Optional[list] = None
     tool_choice: Optional[Literal["auto", "any", "none"]] = None
     random_seed: Optional[int] = None

@@ -61,7 +58,6 @@ def __init__(
         temperature: Optional[int] = None,
         top_p: Optional[int] = None,
         max_tokens: Optional[int] = None,
-        max_completion_tokens: Optional[int] = None,
         tools: Optional[list] = None,
         tool_choice: Optional[Literal["auto", "any", "none"]] = None,
         random_seed: Optional[int] = None,

@@ -84,7 +80,6 @@ def get_supported_openai_params(self, model: str) -> List[str]:
             "temperature",
             "top_p",
             "max_tokens",
-            "max_completion_tokens"
             "tools",
             "tool_choice",
             "seed",

@@ -110,8 +105,6 @@ def map_openai_params(
         for param, value in non_default_params.items():
             if param == "max_tokens":
                 optional_params["max_tokens"] = value
-            if param == "max_completion_tokens": # max_completion_tokens should take priority
-                optional_params["max_tokens"] = value
             if param == "tools":
                 optional_params["tools"] = value
             if param == "stream" and value is True:
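
Both removed branches write the same optional_params["max_tokens"] key, so the "priority" promised by the removed inline comment actually depended on the iteration order of non_default_params. A standalone sketch of that reverted behavior (map_params_reverted is an illustrative name, not litellm's API):

# Sketch of the reverted mapping: both branches target the same key, so
# whichever parameter appears later in the input dict silently wins.
def map_params_reverted(non_default_params: dict) -> dict:
    optional_params: dict = {}
    for param, value in non_default_params.items():
        if param == "max_tokens":
            optional_params["max_tokens"] = value
        if param == "max_completion_tokens":  # claimed to take priority
            optional_params["max_tokens"] = value
    return optional_params

print(map_params_reverted({"max_tokens": 10, "max_completion_tokens": 20}))  # {'max_tokens': 20}
print(map_params_reverted({"max_completion_tokens": 20, "max_tokens": 10}))  # {'max_tokens': 10}
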
45 changes: 0 additions & 45 deletions tests/litellm/llms/mistral/test_mistral_transformation.py

This file was deleted.
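
With the revert applied, callers targeting Mistral through litellm can cap generation length by passing max_tokens directly, which the remaining mapping forwards unchanged. A minimal usage sketch, assuming a configured Mistral API key; the model name is illustrative:

import litellm

# max_tokens is forwarded to Mistral as-is by the reverted MistralConfig.
response = litellm.completion(
    model="mistral/mistral-small-latest",  # illustrative model name
    messages=[{"role": "user", "content": "Summarize unified diffs in one sentence."}],
    max_tokens=100,
)
print(response.choices[0].message.content)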
