GPT-4 Turbo completion tokens error #207
-
Hi @FamedBear16, sorry for the late reply. Can you refer to #256 (comment)?
-
Hi,

I am using Copilot in Conversation mode, with max tokens set to 8,000, and I am submitting a question plus context of around 7,900 tokens.
I am getting the following error:
However, if I use GPT-3.5 16K, everything works well. What is happening? GPT-4 has a much larger context window.
I did another test: if I set the limit to 4,000 tokens, GPT-4 Turbo works well with the same input of around 7,900 tokens.
Regards,
Pierpaolo
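
The pattern described (8,000 fails, 4,000 succeeds, regardless of prompt size) suggests the request is tripping a per-model cap on *completion* tokens rather than running out of context. As a minimal sketch of that idea, the snippet below validates a requested `max_tokens` against an assumed per-model completion cap; the function name, dictionary, and cap values are illustrative assumptions, not authoritative limits for these models.

```python
# Sketch: validate a request's max_tokens against an assumed per-model
# completion-token cap. Names and values here are illustrative assumptions.
COMPLETION_TOKEN_CAP = {
    "gpt-4-turbo": 4096,        # assumed: output budget capped below the context window
    "gpt-3.5-turbo-16k": 16384, # assumed: output budget up to the full window
}

def check_max_tokens(model: str, max_tokens: int) -> bool:
    """Return True if the requested completion budget fits the model's cap."""
    cap = COMPLETION_TOKEN_CAP.get(model)
    return cap is None or max_tokens <= cap

# Under these assumed caps, an 8,000-token completion budget is rejected
# for the Turbo model, while 4,000 is accepted:
print(check_max_tokens("gpt-4-turbo", 8000))  # False
print(check_max_tokens("gpt-4-turbo", 4000))  # True
```

If this is the cause, lowering `max_tokens` (as in the 4,000-token test above) avoids the error without shrinking the prompt itself.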