Replies: 1 comment
-
The proper way to set the max_tokens value is to pass it in through the query method, like this:
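A minimal sketch of this approach (the loader, file name, and question string are placeholders, not the original code): instead of relying on the default LLM that `VectorstoreIndexCreator` creates, construct an `OpenAI` LLM with a larger `max_tokens` and hand it to `query()`.

```python
from langchain.document_loaders import TextLoader
from langchain.indexes import VectorstoreIndexCreator
from langchain.llms import OpenAI

# Build the index as in the question (the file name is a placeholder).
loader = TextLoader("my_document.txt")
index = VectorstoreIndexCreator().from_loaders([loader])

# Pass an LLM with a larger max_tokens to query(); this overrides the
# 256-token default of the OpenAI completion model.
llm = OpenAI(temperature=0, max_tokens=1024)
print(index.query("Summarize the document.", llm=llm))
```

Raising `max_tokens` only lifts the cap on the completion; the model still stops earlier if it finishes its answer, and the prompt plus completion must fit within the model's context window.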
-
Hello,
I have just jumped into the world of LangChain. Following some online tutorials, I managed to set up a working Colab notebook using the OpenAI API and LangChain, but I use VectorstoreIndexCreator and it seems I can't change the max_tokens variable for the OpenAI API, so it outputs only 256 tokens every time.
How can I increase the maximum response length?
I use the code below:
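A minimal sketch of this kind of setup (the loader type, file name, and question are assumptions for illustration):

```python
from langchain.document_loaders import TextLoader
from langchain.indexes import VectorstoreIndexCreator

# Load a document and build a vector-store index over it.
loader = TextLoader("my_document.txt")
index = VectorstoreIndexCreator().from_loaders([loader])

# With no LLM passed in, query() falls back to a default OpenAI LLM
# whose max_tokens defaults to 256 -- hence the truncated answers.
answer = index.query("Summarize the document.")
print(answer)
```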