Question answering using own dataset #1047
Unanswered · VikasRathod314 asked this question in Q&A · 1 comment, 2 replies
You may need to use |
How much text can we pass to a question answering system? Is there any specific data size limitation in LangChain?
I get an error when I pass almost 32,000 tokens: (InvalidRequestError: This model's maximum context length is 8191 tokens, however you requested 24068 tokens (24068 in your prompt; 0 for the completion). Please reduce your prompt; or completion length.). Please suggest.
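The error means the whole document was sent to the model in a single prompt that exceeds its 8191-token context window. A common workaround (not spelled out in this thread) is to split the text into overlapping chunks that each fit the window, then run the QA chain over the chunks (for example with LangChain's map_reduce chain type). Below is a minimal, library-free sketch of the chunking step; the chunk sizes are illustrative, not limits from any specific model:

```python
def split_into_chunks(text, chunk_size=2000, overlap=200):
    """Split text into overlapping character chunks.

    Overlap keeps a little shared context between neighbouring
    chunks so answers spanning a boundary are not lost.
    """
    chunks = []
    start = 0
    step = chunk_size - overlap
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += step
    return chunks

# Stand-in for a long text file (~25,000 characters).
doc = "word " * 5000
chunks = split_into_chunks(doc)
print(len(chunks), max(len(c) for c in chunks))  # -> 14 2000
```

Each chunk can then be passed to the model separately (or embedded and retrieved), so no single request exceeds the model's maximum context length. In LangChain itself, `RecursiveCharacterTextSplitter` serves the same purpose with smarter boundary handling.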