I can't get Bedrock settings to work #11443
recrudesce started this conversation in General
So, it seems that performanceConfig was added to the Bedrock capability back in March, as per #7606 (comment). I found the documentation here: https://docs.litellm.ai/docs/providers/bedrock#usage---latency-optimized-inference
But if I configure as per that documentation, I get the following error during inference:
litellm.BadRequestError: BedrockException - validationException {"message":"The model returned the following errors: performanceConfig: Extra inputs are not permitted"}
Config is as follows:
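It's essentially the latency-optimized example from the docs, roughly this shape (the model alias and region below are illustrative, not my exact entry):

```yaml
model_list:
  - model_name: bedrock-claude-3-7-sonnet          # illustrative alias
    litellm_params:
      model: bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0
      aws_region_name: us-east-1                   # assumed region
      performanceConfig: {"latency": "optimized"}  # per the linked docs
```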
The same goes if I try setting the reasoningEffort as per the documentation. I set
reasoning_effort: "low"
in my config and it throws:
400: litellm.UnsupportedParamsError: bedrock does not support parameters: {'reasoning_effort': 'low'}, for model=us.anthropic.claude-3-7-sonnet-20250219-v1:0. To drop these, set litellm.drop_params=True or for proxy:
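For completeness, the reasoning_effort attempt is just the documented parameter on the same kind of model entry, roughly (again, alias illustrative):

```yaml
model_list:
  - model_name: bedrock-claude-3-7-sonnet          # illustrative alias
    litellm_params:
      model: bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0
      reasoning_effort: "low"                      # as shown in the docs
```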
What am I doing wrong, when I'm following the documentation exactly for these options?