Add a custom context/token limit for models. #2422
root-reindeer-flotilla started this conversation in Feature Requests
Replies: 0 comments
With the release of the Gemini 2.5 preview, the cost isn't as bad as some other models, but it burns through tokens quickly if you use the Boomerang method, where each subtask has to send a lot of tokens, or if you're just dealing with a very large context.
I suggest, if it's possible, adding a slider that lets us limit what Roo treats as the model's context window size. Gemini 2.5 has a 1-million-token context window, but like I said, that adds up very fast. Obviously this would reduce effectiveness, but it would be a good option to help control costs on smaller projects or tasks. And of course this would only go down; you can't increase the context limit on our end, so that's not what I'm asking for.