Replies: 1 comment
-
It should just work across multiple GPUs. Can you share your log? And did you build with CUDA support?
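For context, if this is llama.cpp (the gguf format suggests so), a multi-GPU run looks roughly like the sketch below. The cmake option and flags (`GGML_CUDA`, `-ngl`, `--tensor-split`) are from recent llama.cpp versions and may differ on older builds; the model filename is a placeholder.

```shell
# Sketch, assuming llama.cpp; names/flags may vary between versions.

# Build with CUDA support (older builds used -DLLAMA_CUBLAS=ON instead):
cmake -B build -DGGML_CUDA=ON
cmake --build build --config Release

# Offload all layers to the GPUs; llama.cpp splits layers across all
# visible CUDA devices by default, and --tensor-split tunes the ratio.
./build/bin/llama-cli -m ./llama-70b.Q4_K_M.gguf -ngl 99 \
    --tensor-split 1,1,1,1,1,1,1,1 -p "Hello"
```

If the log shows only one CUDA device being initialized, the binary was likely built without CUDA support.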
-
I have a rig with 8 A4000s and I'm wondering if it's even possible to run the 70B version on that. I have the 70B gguf model, but it seems the model can't run across multiple GPUs. I'm new to this stuff, no flame. Thanks for any help.
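As a sanity check on whether the model fits at all, here is a rough arithmetic sketch. The bits-per-weight figure is an assumed approximation for a 4-bit quantization, not an exact gguf file size, and the overhead fraction is a guessed margin for KV cache and buffers:

```python
# Back-of-envelope check: does a ~4-bit-quantized 70B model fit
# across 8 x 16 GB GPUs (an A4000 has 16 GB of VRAM)?

def model_size_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB (ignores KV cache and overhead)."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

def fits(params_billion: float, bits_per_weight: float,
         num_gpus: int, vram_gb_per_gpu: float,
         overhead_frac: float = 0.2) -> bool:
    """True if the weights plus a rough overhead margin fit in total VRAM."""
    needed = model_size_gb(params_billion, bits_per_weight) * (1 + overhead_frac)
    return needed <= num_gpus * vram_gb_per_gpu

size = model_size_gb(70, 4.5)  # ~4.5 bits/weight is a typical 4-bit quant
print(f"~{size:.1f} GB of weights")                    # ~39.4 GB
print(fits(70, 4.5, num_gpus=8, vram_gb_per_gpu=16))   # True: 8 x 16 GB total
```

So a 4-bit 70B model should comfortably fit across the rig once layers are actually split over the GPUs; the same model in fp16 (~140 GB) would be much tighter.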