
python: Q regarding support for Prompt Caching in BedrockChatCompletion #12741

Answered by joshuamosesb
joshuamosesb asked this question in Q&A

One approach could be: in bedrock_chat_completion.py, _prepare_settings_for_request could check, based on self.ai_model_id, whether the model supports prompt caching; if so, all four functions mapped in the MESSAGE_CONVERTERS dict could be updated or overloaded (either via an optional parameter, or to support prompt caching by default) to include the cachePoint marker

                "cachePoint": {
                    "type": "default"
                }
            }
in their respective return values, to enable prompt caching in the request.
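
A minimal sketch of that idea follows. The converter signatures, the set of caching-capable model IDs, and helper names like with_cache_point and converters_for_model are hypothetical, not the actual code in bedrock_chat_completion.py; only the shape of the cachePoint content block comes from the Bedrock Converse API.

    from functools import wraps
    from typing import Any, Callable

    # Illustrative, not exhaustive: model IDs assumed to support prompt caching.
    PROMPT_CACHING_MODEL_IDS = {
        "anthropic.claude-3-5-sonnet-20241022-v2:0",
        "anthropic.claude-3-7-sonnet-20250219-v1:0",
    }

    # The Converse API cache-point content block.
    CACHE_POINT_BLOCK: dict[str, Any] = {"cachePoint": {"type": "default"}}

    def with_cache_point(
        converter: Callable[..., dict[str, Any]],
    ) -> Callable[..., dict[str, Any]]:
        """Wrap a message converter so its result ends with a cachePoint block."""
        @wraps(converter)
        def wrapped(*args: Any, **kwargs: Any) -> dict[str, Any]:
            message = converter(*args, **kwargs)
            # Converse-style messages keep a list of content blocks under "content".
            content = message.get("content")
            if isinstance(content, list):
                content.append(CACHE_POINT_BLOCK)
            return message
        return wrapped

    def converters_for_model(
        ai_model_id: str,
        converters: dict[str, Callable[..., dict[str, Any]]],
    ) -> dict[str, Callable[..., dict[str, Any]]]:
        """Return the MESSAGE_CONVERTERS mapping, wrapped when the model supports caching."""
        if ai_model_id in PROMPT_CACHING_MODEL_IDS:
            return {role: with_cache_point(fn) for role, fn in converters.items()}
        return converters

Called from _prepare_settings_for_request, something like this would leave non-caching models untouched while appending the marker for caching-capable ones, without changing the individual converters themselves.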
Please review and share feedback, or post another comment if there is a better approach.
