Option to specify that model supports Prompt Caching with OpenAI Compatible provider #1549
Closed · dleen started this conversation in Feature Requests
- Makes sense @dleen! I assume you'd also want to be able to set the prices for the cached input/output?
- We are using a gateway to access models (similar to OpenRouter) with `anthropic.claude-3-7-sonnet-20250219-v1:0`, which does support prompt caching. It would be great to have a checkbox like "Computer Use" to tell Roo that this model supports prompt caching.
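For illustration, here is a minimal sketch of what such a per-model setting could look like for an OpenAI Compatible model, covering both the requested checkbox and the cache pricing mentioned above. All names here (`CustomModelInfo`, `supportsPromptCache`, `cacheWritesPrice`, `cacheReadsPrice`) are assumptions for this sketch, not Roo's actual configuration schema; the prices are Anthropic's published per-million-token rates for Claude 3.7 Sonnet, used only as example values.

```ts
// Hypothetical shape of a user-defined model entry for the OpenAI
// Compatible provider; illustrative names, not Roo's real schema.
interface CustomModelInfo {
  modelId: string;
  supportsComputerUse: boolean; // existing capability checkbox
  supportsPromptCache: boolean; // the checkbox requested in this thread
  // Optional per-million-token prices, so cost tracking stays accurate
  // when cached tokens are billed differently from fresh ones.
  inputPrice?: number;
  outputPrice?: number;
  cacheWritesPrice?: number; // writing a prompt prefix into the cache
  cacheReadsPrice?: number;  // reusing a cached prompt prefix
}

// Example: the gateway-hosted Claude model from this thread.
const model: CustomModelInfo = {
  modelId: "anthropic.claude-3-7-sonnet-20250219-v1:0",
  supportsComputerUse: true,
  supportsPromptCache: true,
  inputPrice: 3.0,
  outputPrice: 15.0,
  cacheWritesPrice: 3.75,
  cacheReadsPrice: 0.3,
};
```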