Replies: 1 comment
Try increasing the context size with the -c 1024 flag. I think the default is 512, so that would double it.
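A minimal sketch of what that could look like, assuming llama.cpp's ./main binary and a placeholder model path (both are assumptions, not stated in the thread):

```sh
# Sketch only: binary name and model path are placeholders.
# -c (--ctx_size) sets the prompt context size; early builds default to 512.
./main -m ./models/30B/ggml-model-q4_0.bin --instruct -c 1024
```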
I've been testing Alpaca 30B (-t 24 -n 2000 --temp 0.2 -b 32 --n_parts 1 --ignore-eos --instruct).
It consistently "stops" after 300-400 tokens of output (with 30-40 tokens of input).
There's no error message and no crash, and given -n 2000 and --ignore-eos there's no reason for it to stop so early.
It would be useful if the program printed a verbose quit reason, though in my case I can't see any reason for it to stop before the token max is reached.
I'm not sure whether this is a bug to report or whether I'm missing something.
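For reference, those flags correspond to an invocation roughly like the following (the model path is a placeholder; the flags are as listed above):

```sh
# Reproduction sketch: the model path is a placeholder, flags as reported.
./main -m ./models/ggml-alpaca-30b-q4.bin \
  -t 24 -n 2000 --temp 0.2 -b 32 --n_parts 1 \
  --ignore-eos --instruct
```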