A possible method to overcome model context length limitations #883
cexer started this conversation in Feature Requests
Even Claude only supports a maximum context of 200K tokens, which is not enough for larger projects or for longer iterations.
Here is an optimization method I have thought of; I don't know whether it is feasible (a rough sketch follows after the list):
1. When a task is completed and archived as a file, generate a summary, which can include references to related file paths (rather than the full file contents).
2. In subsequent tasks, the request contains only the summary text of the previous tasks, rather than the full conversation contents.
3. If more specific information from a previous task is needed, retrieve that task's archive file and read it.
There is a project that uses this method: https://github.com/sztimhdd/Looping-Claude
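A minimal sketch of the idea in Python, just to make the three steps concrete. All names here (`archive_task`, `build_prompt`, `recall_task`, the `task_archive/` directory, and the JSON layout) are hypothetical illustrations, not part of any existing tool or of the Looping-Claude project linked above.

```python
import json
from pathlib import Path

ARCHIVE_DIR = Path("task_archive")  # assumed location for per-task archives


def archive_task(task_id: str, conversation: str, summary: str,
                 related_files: list[str]) -> None:
    """Step 1: store the full conversation on disk and keep only a short
    summary plus file-path references for later prompts."""
    ARCHIVE_DIR.mkdir(exist_ok=True)
    (ARCHIVE_DIR / f"{task_id}.json").write_text(json.dumps({
        "summary": summary,              # short text to include in future prompts
        "related_files": related_files,  # paths only, not full file contents
        "conversation": conversation,    # full transcript, read back only on demand
    }))


def build_prompt(new_request: str) -> str:
    """Step 2: a new request carries only the summaries of earlier tasks,
    not their full conversations."""
    summaries = []
    for f in sorted(ARCHIVE_DIR.glob("*.json")):
        data = json.loads(f.read_text())
        refs = ", ".join(data["related_files"])
        summaries.append(f"- {f.stem}: {data['summary']} (files: {refs})")
    return ("Previous tasks:\n" + "\n".join(summaries)
            + "\n\nCurrent task:\n" + new_request)


def recall_task(task_id: str) -> str:
    """Step 3: if more detail about an earlier task is needed, fetch that
    task's full archive and read it."""
    return json.loads((ARCHIVE_DIR / f"{task_id}.json").read_text())["conversation"]
```

The point of the sketch is that the prompt for a new task grows only with the number of summaries, not with the full length of every past conversation; the full transcripts stay on disk and are pulled back in only when step 3 is triggered.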