Does langchain support prompt caching for AWS Bedrock? #30743
Does langchain support prompt caching for AWS Bedrock? The Bedrock documentation does not mention it: https://python.langchain.com/docs/integrations/chat/bedrock/

But the Anthropic documentation covers it explicitly: https://python.langchain.com/docs/integrations/chat/anthropic/#incremental-caching-in-conversational-applications

So I assume it is not supported yet? I thought about opening an issue asking for support, but wanted to ask here first. Note that prompt caching for Claude Sonnet 3.7 and Claude 3.5 Haiku is no longer in preview.
Replies: 1 comment
It is implemented in ChatBedrock but not documented well! That's a great call-out; will get that fixed. You can see the implementation in the PR here (you can add

`"cache_control": {"type": "ephemeral"}`

to content blocks, just like with ChatAnthropic). For ChatBedrockConverse, there's an open issue: langchain-ai/langchain-aws#326
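To illustrate, here is a minimal sketch of the content-block shape described above. Plain dicts are used so the structure is visible without AWS credentials; in practice you would pass these blocks as the content of a message to `ChatBedrock.invoke()`. The model ID and prompt text below are illustrative assumptions, not taken from the thread.

```python
# Content blocks for Bedrock prompt caching: add "cache_control":
# {"type": "ephemeral"} to the block you want cached, just as with
# ChatAnthropic. The long reusable context goes in the cached block.
long_context = "Large, reusable system context..."  # illustrative text

message_content = [
    {
        "type": "text",
        "text": long_context,
        # Marks this block for ephemeral caching on supported Claude models.
        "cache_control": {"type": "ephemeral"},
    },
    # The per-turn question stays outside the cached block.
    {"type": "text", "text": "A short question about the context."},
]

# Usage sketch (requires AWS credentials and langchain-aws installed):
# from langchain_aws import ChatBedrock
# from langchain_core.messages import HumanMessage
# llm = ChatBedrock(model_id="anthropic.claude-3-5-haiku-20241022-v1:0")
# response = llm.invoke([HumanMessage(content=message_content)])
```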