Prompt caching could be more efficient if big files are read into context #3286
manueloverride started this conversation in Feature Requests
Replies: 0 comments
- App Version: 3.16.0
- API Provider: Google Gemini
- Model Used: Gemini 2.5 Pro
I have a design mode in which the role's task includes reading a big file into context. I then interact with the mode by asking questions or giving it tasks. However, the first few turns are more expensive than they need to be, because the big file apparently isn't added to the prompt cache immediately.
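For reference, Gemini exposes explicit context caching, so in principle the big file could be pushed into a cache up front instead of being billed at the full input-token rate on the first several turns. Below is a minimal sketch using the google-genai Python SDK; the file path, TTL, model string, and sample question are placeholder assumptions, not details from this report.

```python
# Minimal sketch of explicit context caching with the google-genai SDK.
# Assumptions (not from the original report): the file path, model
# string, TTL, and sample question are all placeholders.
from google import genai
from google.genai import types

client = genai.Client()  # picks up the API key from the environment

# Read the big file once (hypothetical path).
with open("docs/big_design_doc.md", encoding="utf-8") as f:
    big_file_text = f.read()

# Create an explicit cache containing the big file, so later turns
# pay the cheaper cached-token rate for this content.
cache = client.caches.create(
    model="gemini-2.5-pro",
    config=types.CreateCachedContentConfig(
        contents=[big_file_text],
        ttl="3600s",  # keep the cache alive for one hour
    ),
)

# Each subsequent turn references the cache instead of resending the file.
response = client.models.generate_content(
    model="gemini-2.5-pro",
    contents="Summarize the constraints in the design doc.",
    config=types.GenerateContentConfig(cached_content=cache.name),
)
print(response.text)
```

If the app created (or warmed) such a cache as soon as the file is read into context, rather than only after a few turns, the early turns of a conversation would not repeatedly pay full price for the same large content.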