**Question:** Is there any way I can find the maximum context length of a local LLM?

**Answer:** Besides checking the model page, you can look at the console output when the model loads. For example, with mistral-openorca the load logs report the training context length, meaning the intended max context length is 32k in this case.
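If the model is stored as a GGUF file (the format used by llama.cpp and Ollama), the context length is also recorded in the file's metadata under the `{architecture}.context_length` key, so you can read it without loading the model at all. Below is a minimal sketch of a metadata reader, assuming a GGUF v3 file; the key-matching logic and function names are illustrative, not part of any official API:

```python
import struct

def _read_str(f):
    """GGUF strings are a uint64 length followed by UTF-8 bytes."""
    (n,) = struct.unpack("<Q", f.read(8))
    return f.read(n).decode("utf-8")

def _read_value(f, vtype):
    """Read one metadata value given its GGUF type code."""
    # Scalar type codes from the GGUF spec: uint8..float64
    fmts = {0: "<B", 1: "<b", 2: "<H", 3: "<h", 4: "<I",
            5: "<i", 6: "<f", 7: "<?", 10: "<Q", 11: "<q", 12: "<d"}
    if vtype in fmts:
        fmt = fmts[vtype]
        (v,) = struct.unpack(fmt, f.read(struct.calcsize(fmt)))
        return v
    if vtype == 8:  # string
        return _read_str(f)
    if vtype == 9:  # array: element type (uint32), count (uint64), elements
        (etype,) = struct.unpack("<I", f.read(4))
        (count,) = struct.unpack("<Q", f.read(8))
        return [_read_value(f, etype) for _ in range(count)]
    raise ValueError(f"unknown GGUF value type {vtype}")

def gguf_context_length(path):
    """Return the training context length stored in a GGUF file, or None."""
    with open(path, "rb") as f:
        if f.read(4) != b"GGUF":
            raise ValueError("not a GGUF file")
        # Header: version (uint32), tensor count (uint64), metadata kv count (uint64)
        _version, _n_tensors, n_kv = struct.unpack("<IQQ", f.read(20))
        for _ in range(n_kv):
            key = _read_str(f)
            (vtype,) = struct.unpack("<I", f.read(4))
            value = _read_value(f, vtype)
            # The architecture prefix varies (llama, mistral, ...), so match the suffix.
            if key.endswith(".context_length"):
                return value
    return None
```

For a mistral-openorca GGUF, `gguf_context_length("model.gguf")` would return 32768, matching the value reported in the console output.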