GPT-3.5 Model Context Length Issue #1305
evannaderi
started this conversation in
Help
Replies: 0 comments
Whenever I try to use GPT-3.5, it alerts me that this model only supports 4,097 tokens and that I am requesting too many, because of the tokens in the messages plus the 4,096 requested for the completion. It does this even when I turn the context length way down to 2,000.
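This looks like the usual cause of that error: GPT-3.5's 4,097-token limit covers the prompt *and* the completion combined, so a `max_tokens` of 4,096 leaves almost no room for the messages. A minimal sketch of the budgeting arithmetic, assuming the client passes `max_tokens` explicitly (`completion_budget` and `CONTEXT_LIMIT` are hypothetical names, not part of any library):

```python
CONTEXT_LIMIT = 4097  # gpt-3.5-turbo's total window: prompt + completion tokens

def completion_budget(prompt_tokens: int, requested_max: int = 4096) -> int:
    """Cap max_tokens to whatever room the prompt leaves in the window."""
    remaining = CONTEXT_LIMIT - prompt_tokens
    return max(0, min(requested_max, remaining))

# A 2,000-token prompt leaves at most 2,097 tokens for the completion,
# so requesting the full 4,096 triggers the error in the question.
print(completion_budget(2000))
```

If the app is computing `max_tokens` independently of the context-length setting, lowering the context slider would not help, which would match the behavior described above.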