Camelidae : Sparse Mixture of Experts (8x34b in less than 80gb) #4955
logikstate started this conversation in General · 2 comments
- https://huggingface.co/serpdotai/sparsetral-16x7B-v2 also looks similar. It would be nice to run these in llama.cpp in GGUF format.
- Is there anyone working on this?
- Has anybody seen this? https://github.com/wuhy68/Parameter-Efficient-MoE