MPI RAM usage
#6524
Replies: 1 comment · 2 replies
- The MPI documentation is severely out of date and doesn't even mention that MPI is currently broken. I have a PR with a mostly functional, complete redesign that also lets you specify how to split the model among the nodes. It doesn't support GPUs yet, but I've been working on fixing that.
- When this part of the README says RAM, does it mean the amount of RAM the model uses in memory? Is this how I split how much system RAM is used? And how do I then configure VRAM usage for, say, two computers, each with one 8 GB GPU and 32 GB of RAM, where I want system 'A' to use more RAM if the model has to be offloaded to RAM (not enough VRAM)?
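For context, the split described in the older llama.cpp README's MPI section was controlled by an MPI hostfile rather than by an explicit RAM setting. A minimal sketch of that workflow, assuming the old `LLAMA_MPI` build path still applies (the IP addresses and slot counts below are illustrative assumptions, not values from this thread, and per-node VRAM control was not part of that scheme):

```shell
# Build with MPI support (the old LLAMA_MPI make flag from the README).
make CC=mpicc CXX=mpicxx LLAMA_MPI=1

# hostfile: one line per node. 'slots' sets how many MPI ranks run on
# each machine, and each rank holds a shard of the model's layers, so
# giving system 'A' more slots gives it a larger share of the model's RAM.
cat > hostfile <<'EOF'
192.168.0.10 slots=2
192.168.0.11 slots=1
EOF

# Launch one process per slot across both machines (model path is an example).
mpirun -hostfile hostfile -n 3 ./main -m ./models/7B/ggml-model-q4_0.gguf -n 128
```

Under that scheme the split is by rank count only; there was no knob to say "node A may use 32 GB, node B may use 8 GB of VRAM," which matches the reply's point that GPU support isn't there yet.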