[fixed] Multi LoRAs in one GGML file! #1481
Replies: 7 comments 8 replies
-
That part only proved that intelligence is an illustration.
-
“Your scientists were so preoccupied with whether they could, they didn’t stop to think if they should.” On a more serious note, have you tested it?
-
After some tests, like asking it to list 21 types of blood test, it works fine. Meanwhile, that model helped me understand the build.zig in llama.cpp. It's Rust!!!!
-
For anyone wondering, this is a meme. The model is actually just WizardLM-uncensored.
-
To be clear, I uploaded it with the new merge script.
🥲 After I tested with the same seed, same ctx, and same prompt sequence, the results were identical, so it was not working. Sorry to anyone who downloaded it. Now it works fine.
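In case anyone wants to reproduce that check: run the base model and the merged model with an identical seed, ctx, and prompt at temperature 0 and see whether the outputs diverge. Here is a minimal sketch assuming llama-cpp-python as the runner; the file names and the prompt are placeholders, not the actual files from this thread.

```python
# Minimal sketch of the same-seed comparison, assuming llama-cpp-python.
# The model paths and prompt below are placeholders.
from llama_cpp import Llama

PROMPT = "List 3 types of blood test."  # any fixed prompt works
SEED, CTX = 42, 2048                    # same seed and same ctx for both runs

def generate(model_path: str) -> str:
    """Run one deterministic completion (temperature 0) for the given model."""
    llm = Llama(model_path=model_path, n_ctx=CTX, seed=SEED)
    out = llm(PROMPT, max_tokens=64, temperature=0.0)
    return out["choices"][0]["text"]

base_text   = generate("wizardlm-7b-uncensored.ggml.bin")  # hypothetical path
merged_text = generate("wizardlm-7b-multi-lora.ggml.bin")  # hypothetical path

# Identical outputs mean the LoRA weights were never actually merged.
if base_text == merged_text:
    print("Outputs identical: the LoRA had no effect.")
else:
    print("Outputs differ: the merge took effect.")
```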
-
Cool, I'll try it out. What llama.cpp parameters do you recommend for it? Thanks.
-
Did it again with the new PEFT method.
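For anyone following along, the usual PEFT way to bake a LoRA into the base weights is merge_and_unload(). The sketch below assumes that method; the checkpoint and adapter names are placeholders, since the thread doesn't name the exact script or repos used.

```python
# Rough sketch of a PEFT-based LoRA merge; repo names are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE = "ehartford/WizardLM-7B-Uncensored"   # assumed base checkpoint
LORA = "your-username/your-lora-adapter"    # hypothetical LoRA adapter repo
OUT  = "./wizardlm-7b-merged"

base = AutoModelForCausalLM.from_pretrained(BASE, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base, LORA)

# Fold the LoRA weights into the base weights and drop the adapter wrappers.
merged = model.merge_and_unload()
merged.save_pretrained(OUT)
AutoTokenizer.from_pretrained(BASE).save_pretrained(OUT)

# The merged HF checkpoint can then be converted to GGML and quantized with
# llama.cpp's conversion tooling, e.g. `python convert.py ./wizardlm-7b-merged`.
```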
-
Merged.
Now 7B & 13B, so 🤷
Really fast!!!!!
Based on WizardLM-uncensored.
Two versions:
1. 7B + multi-language LoRAs only.
2. 13B + Starcoder LoRA.
Feel free to check them out.
Click this guy -> 🤷