Replies: 1 comment
-
For fine-tuning, I think 24 GB is the minimum, so a 3090.
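A rough back-of-the-envelope calculation shows where the 24 GB figure comes from. The sketch below uses common ballpark assumptions (fp16 weights and gradients, fp32 Adam optimizer states, and a hypothetical 1% trainable fraction for LoRA); activation memory is omitted since it depends on batch size, sequence length, and gradient checkpointing:

```python
# Ballpark VRAM estimates for fine-tuning a model with `params_b`
# billion parameters. Figures are rough assumptions, not measurements.

def full_finetune_gb(params_b, weight_bytes=2, grad_bytes=2, optim_bytes=8):
    """Full fine-tune: fp16 weights + fp16 gradients + fp32 Adam
    states (two fp32 moments = 8 bytes/param)."""
    return params_b * (weight_bytes + grad_bytes + optim_bytes)

def lora_finetune_gb(params_b, trainable_frac=0.01, weight_bytes=2):
    """LoRA-style fine-tune: frozen fp16 base weights, plus a small
    trainable fraction that carries gradients and Adam states."""
    base = params_b * weight_bytes
    adapters = params_b * trainable_frac * (2 + 2 + 8)
    return base + adapters

print(f"7B full fine-tune : ~{full_finetune_gb(7):.0f} GB")
print(f"7B LoRA (1% train): ~{lora_finetune_gb(7):.1f} GB")
```

Under these assumptions a full fine-tune of a 7B model needs far more than 24 GB, while a LoRA-style run lands around 15 GB plus activations, which is why 24 GB cards like the 3090 are the comfortable entry point and 16 GB cards are tight.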
-
I want to get a budget GPU for LLM training/fine-tuning/experimentation.
I am considering:
a single RTX 4070 Ti Super 16 GB
or a single RTX 4080 16 GB
or a dual lower-end GPU setup, e.g. dual RTX 4070 12 GB
Any thoughts, advice, or other recommendations?