python: Q regarding support for Prompt Caching in BedrockChatCompletion #12741
Answered · joshuamosesb asked this question in Q&A
While using BedrockChatCompletion + ChatCompletionAgent (in the Python API), is the Prompt Caching feature of Bedrock supported?
Answered by joshuamosesb · Jul 18, 2025
One approach could be: in bedrock_chat_completion.py, _prepare_settings_for_request could check self.ai_model_id and whether that model supports prompt caching; based on that, all four functions mapped in the MESSAGE_CONVERTERS dict could be updated / overloaded (either with an optional parameter, or to support prompt caching by default) to include the cachePoint marker, as in the sketch below.
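
As a rough sketch of that idea (not semantic-kernel's actual implementation): the wrapper below decorates a converter function so the converted Converse-API message ends with a cachePoint content block. The helper names (`supports_prompt_caching`, `with_cache_point`), the model-prefix list, and the stand-in converter are assumptions for illustration; only `bedrock_chat_completion.py`, `_prepare_settings_for_request`, `self.ai_model_id`, `MESSAGE_CONVERTERS`, and the cachePoint marker come from the suggestion above.

```python
from functools import wraps

# Hypothetical allow-list; confirm against the Bedrock documentation which models
# actually support prompt caching before relying on this.
PROMPT_CACHING_MODEL_PREFIXES = (
    "anthropic.claude-3-7-sonnet",
    "anthropic.claude-3-5-haiku",
)


def supports_prompt_caching(model_id: str) -> bool:
    """Best-effort check against the configured ai_model_id."""
    return model_id.startswith(PROMPT_CACHING_MODEL_PREFIXES)


def with_cache_point(converter):
    """Wrap a message converter so the converted Converse message ends with a
    cachePoint content block: {"cachePoint": {"type": "default"}}."""

    @wraps(converter)
    def wrapper(message):
        bedrock_message = converter(message)
        bedrock_message.setdefault("content", []).append(
            {"cachePoint": {"type": "default"}}
        )
        return bedrock_message

    return wrapper


# Stand-in converter for illustration; in semantic-kernel these would be the four
# callables registered in MESSAGE_CONVERTERS.
def _user_message_converter(text: str) -> dict:
    return {"role": "user", "content": [{"text": text}]}


if supports_prompt_caching("anthropic.claude-3-7-sonnet-20250219-v1:0"):
    cached_converter = with_cache_point(_user_message_converter)
    print(cached_converter("Long, reusable system/context prompt goes here."))
```

With something like this, _prepare_settings_for_request could swap in the wrapped converters only when `supports_prompt_caching(self.ai_model_id)` is true, leaving the default behavior unchanged for models without prompt-caching support.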