Is it possible server to support distributed CPUs inference? #6790
Unanswered
hyperbolic-c asked this question in Q&A
Replies: 1 comment
-
In the repo, there is support for MPI.
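To make the MPI suggestion concrete, here is a minimal sketch of the idea behind MPI-style pipeline parallelism for inference: the model's layers are partitioned into contiguous shards, one shard per node, and the activation tensor is passed from node to node. This toy version simulates the nodes in a single process (no real MPI, no real model); the `Node`, `make_layer`, and `shard_model` names are hypothetical illustrations, not APIs from any repository. In a real deployment each `Node` would be a separate MPI rank exchanging activations via point-to-point messages.

```python
from typing import Callable, List

Layer = Callable[[List[float]], List[float]]

def make_layer(weight: float) -> Layer:
    # Stand-in for a transformer layer: scale every activation.
    return lambda xs: [x * weight for x in xs]

class Node:
    """One 'machine' holding a contiguous shard of the model's layers."""
    def __init__(self, layers: List[Layer]):
        self.layers = layers

    def forward(self, activations: List[float]) -> List[float]:
        for layer in self.layers:
            activations = layer(activations)
        return activations

def shard_model(layers: List[Layer], num_nodes: int) -> List[Node]:
    # Contiguous split, as pipeline parallelism requires: node i holds
    # layers [i*per_node, (i+1)*per_node).
    per_node = (len(layers) + num_nodes - 1) // num_nodes
    return [Node(layers[i:i + per_node])
            for i in range(0, len(layers), per_node)]

def distributed_forward(nodes: List[Node], tokens: List[float]) -> List[float]:
    # The activations travel node -> node; in real MPI this hop would be
    # a Send on one rank and a matching Recv on the next.
    acts = tokens
    for node in nodes:
        acts = node.forward(acts)
    return acts

if __name__ == "__main__":
    model = [make_layer(2.0) for _ in range(8)]   # 8 layers, each doubling
    nodes = shard_model(model, num_nodes=4)       # 2 layers per node
    print(distributed_forward(nodes, [1.0, 3.0]))  # [256.0, 768.0]
```

Note that this scheme only hides the model's memory footprint behind multiple machines; each token still flows through every node sequentially, so latency per token does not drop, which is the usual trade-off of pipeline parallelism on CPU clusters.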
-
Just like distributed CPU training (for example, MindSpore's distributed CPU training), is there any solution for distributed LLM inference, i.e., running inference across multiple nodes with multiple CPUs each? Thanks!