Allow configuration of MAX_TOKEN_OUTPUT_LENGTH for custom models #1375
ai-made-approachable
started this conversation in
Ideas
At the moment, adding custom models is a real hassle: you first have to discover that your custom model must be mapped to one of the existing models in this file:
chatbot-ui/lib/chat-setting-limits.ts
Line 63 in 3cfb3f3
Otherwise you get a MAX_TOKEN_OUTPUT_LENGTH error.
I think this field should be configurable in the custom model settings to prevent that issue.
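A minimal sketch of what I mean: instead of requiring an entry in the hard-coded limits map, the lookup could fall back to a value stored on the custom model itself. The names below (`ChatSettingLimits`, `getLimits`, `maxTokenOutputLength`, the example limit values) are illustrative assumptions, not the actual chatbot-ui API:

```typescript
interface ChatSettingLimits {
  MIN_TEMPERATURE: number
  MAX_TEMPERATURE: number
  MAX_TOKEN_OUTPUT_LENGTH: number
  MAX_CONTEXT_LENGTH: number
}

// Existing hard-coded limits for known models (abbreviated, values illustrative).
const CHAT_SETTING_LIMITS: Record<string, ChatSettingLimits> = {
  "gpt-4-turbo-preview": {
    MIN_TEMPERATURE: 0,
    MAX_TEMPERATURE: 2,
    MAX_TOKEN_OUTPUT_LENGTH: 4096,
    MAX_CONTEXT_LENGTH: 128000
  }
}

interface CustomModel {
  modelId: string
  // Proposed user-configurable field in the custom model settings.
  maxTokenOutputLength?: number
}

// Known models keep using the static table; custom models use their
// configured value, with a conservative default as a last resort, so
// the MAX_TOKEN_OUTPUT_LENGTH error never fires for unmapped models.
function getLimits(model: CustomModel): ChatSettingLimits {
  const known = CHAT_SETTING_LIMITS[model.modelId]
  if (known) return known
  return {
    MIN_TEMPERATURE: 0,
    MAX_TEMPERATURE: 1,
    MAX_TOKEN_OUTPUT_LENGTH: model.maxTokenOutputLength ?? 2048,
    MAX_CONTEXT_LENGTH: 4096
  }
}
```

With something like this, a user adding a local model could just set the output limit in the model form rather than editing chat-setting-limits.ts.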