Replies: 2 comments 9 replies
-
Meta casually trains it on 16k GPUs, so we have a limit :)
1 reply
-
So even if you use the 4-bit version you still need 512 GB of RAM ...
8 replies
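As a rough sanity check on the RAM claim, weight footprint is just parameters times bytes per parameter. The sketch below uses 405B parameters (Llama 3.1 405B) as an illustrative figure; actual RAM needs exceed the weights alone because of KV cache and runtime overhead, which is why 512 GB-class systems come up.

```python
def model_size_gb(n_params: float, bits_per_param: float) -> float:
    """Approximate weight footprint in GB (1 GB = 1e9 bytes)."""
    return n_params * bits_per_param / 8 / 1e9

params = 405e9  # Llama 3.1 405B (assumed model size for illustration)
for bits in (16, 8, 4):
    print(f"{bits:>2}-bit: ~{model_size_gb(params, bits):.0f} GB of weights")
```

So 4-bit weights alone land around 200 GB, and the headroom for cache and OS pushes practical configurations toward 512 GB.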
-
Can someone take a guess at the most powerful AMD EPYC 9000-series CPU, which motherboard and DDR RAM types to pair with it, and what kind of t/s we could expect when loading a 400 GB Llama 3 in the future?
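A rough upper bound on t/s is possible because CPU inference at batch size 1 is typically memory-bandwidth bound: each generated token streams roughly the whole weight set through the memory bus, so t/s ≈ effective bandwidth / model size. The bandwidth figure below (12 channels of DDR5-4800, the EPYC 9004 layout) is an assumed theoretical peak, not a measurement; sustained throughput is usually well below it.

```python
def est_tokens_per_sec(bandwidth_gbps: float, model_gb: float) -> float:
    """Upper-bound t/s for bandwidth-bound inference at batch size 1."""
    return bandwidth_gbps / model_gb

# 12 channels x DDR5-4800: 4800 MT/s x 8 bytes per channel (assumption).
peak_bw = 12 * 4800 * 8 / 1000  # -> GB/s
print(f"Theoretical peak bandwidth: ~{peak_bw:.0f} GB/s")
print(f"400 GB model: ~{est_tokens_per_sec(peak_bw, 400):.2f} t/s ceiling")
```

By this estimate even an ideal single-socket system tops out near 1 t/s on a 400 GB model, which is why quantizing the weights down (shrinking `model_gb`) matters as much as picking the fastest CPU.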