Replies: 1 comment
Just to clarify.
Checked other resources
Commit to Help
Example Code
Description
Hi, I am using LangChain with TypeScript and creating an instance of ChatVertexAI.
I can't figure out the maxOutputTokens parameter.
I would like to avoid setting it and have the model always use the maximum tokens allowed; for example, with Gemini 2.0 Flash that is 8192.
Since I use several different models, I don't want to store the max token count for each LLM, and would prefer to always use the maximum available.
Is there an easy way? I see that if I don't define maxOutputTokens, the response is sometimes cut off. I would have expected that when maxOutputTokens is not defined, the maximum is used.
Question: does the LangChain library set a default in any way (I couldn't find any mention, but I'm still asking)?
If it's not LangChain, is this the default of the model itself when maxOutputTokens is not defined?
Thanks.
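Since the Example Code section above is empty, here is a minimal sketch of the workaround the question is trying to avoid: a small, hand-maintained table of per-model output caps, with the looked-up value passed as maxOutputTokens. The table values, the helper name resolveMaxOutputTokens, and the fallback default are all assumptions for illustration, not part of LangChain; the actual ChatVertexAI call is shown only in a comment because it requires the @langchain/google-vertexai package and Google Cloud credentials.

```typescript
// Hypothetical per-model output-token caps. These values are assumptions;
// verify them against the Vertex AI model documentation before relying on them.
const MAX_OUTPUT_TOKENS: Record<string, number> = {
  "gemini-2.0-flash": 8192,
  "gemini-1.5-pro": 8192,
};

// Resolve the cap for a model name, falling back to a conservative default
// when the model is not in the table.
function resolveMaxOutputTokens(model: string, fallback = 8192): number {
  return MAX_OUTPUT_TOKENS[model] ?? fallback;
}

// Where this would plug in (needs @langchain/google-vertexai, omitted here):
//
//   import { ChatVertexAI } from "@langchain/google-vertexai";
//   const llm = new ChatVertexAI({
//     model: "gemini-2.0-flash",
//     maxOutputTokens: resolveMaxOutputTokens("gemini-2.0-flash"),
//   });

console.log(resolveMaxOutputTokens("gemini-2.0-flash")); // 8192
```

This keeps the per-model numbers in one place, which is exactly the bookkeeping the question hopes to avoid; whether an "always use the maximum" mode exists without it depends on the provider's default behavior when maxOutputTokens is omitted.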
System Info
using langchain 0.3.12 and