Replies: 1 comment
Do you use |
I am using yi-34b-chat.Q4_K_M.gguf with llama.cpp, but I am running into an issue like this:
yi-error.mp4
Why does the LLM keep printing output after `<|im_end|>`? How can I configure it so it behaves like ChatGPT?
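A likely cause (an assumption, since the video is not available here) is that `<|im_end|>` is not registered as a stop sequence, so generation runs past the end-of-turn marker. With the llama.cpp CLI this is typically handled by passing the token as a reverse prompt (`-r "<|im_end|>"`); as a fallback, the output can also be truncated client-side. The sketch below shows that client-side truncation; `truncate_at_stop` is a hypothetical helper, not part of llama.cpp:

```python
# Minimal sketch: cut model output at the first ChatML end-of-turn token.
# STOP_TOKEN and truncate_at_stop are illustrative names, not llama.cpp APIs.
STOP_TOKEN = "<|im_end|>"

def truncate_at_stop(text: str, stop: str = STOP_TOKEN) -> str:
    """Return text up to (but not including) the first stop token."""
    idx = text.find(stop)
    return text if idx == -1 else text[:idx]

raw = "Hello! How can I help you today?<|im_end|>\nunwanted continuation..."
print(truncate_at_stop(raw))  # prints only the text before <|im_end|>
```

Server-style wrappers such as llama-cpp-python expose the same idea directly via a `stop=["<|im_end|>"]` argument, which stops sampling at the token rather than trimming afterwards.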