THIS NEEDS TO BE FIXED ASAP!! #253
SenTheNastyDynast started this conversation in Ideas
Replies: 1 comment
-
A batch size of 16 should easily fit in a 3090's memory. I own several 3090s and have explicitly tested this. I'm not sure what is wrong with your setup.
-
After lots of troubleshooting, I finally figured out why this wasn't working on my 3090: the code literally sets a batch size of 16, which is excessive and completely unnecessary. I don't know how to submit a commit, nor do I want to learn, so I'm just letting you know: if you're having problems, set the batch size in api.py to 4 or lower and it will run without issues.
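For anyone hitting the same out-of-memory failure, here is a minimal sketch of the workaround. The variable name `BATCH_SIZE` and the helper `pick_batch_size` below are assumptions for illustration, not taken from this repo's api.py, and the per-sample VRAM cost is a made-up number; check your own copy for the actual setting.

```python
# Hypothetical excerpt of api.py -- the real variable name may differ.
# A stock value of 16 can exceed an RTX 3090's 24 GB of VRAM once model
# weights and activations are loaded; 4 leaves comfortable headroom.
BATCH_SIZE = 4  # was 16

def pick_batch_size(free_vram_gb, per_sample_gb=1.5, reserve_gb=2.0):
    """Illustrative helper: the largest batch that fits in the VRAM left
    after reserving some headroom. per_sample_gb is a hypothetical
    per-sample activation cost, not a measured figure for this model."""
    usable = max(free_vram_gb - reserve_gb, 0.0)
    # Always return at least a batch of 1.
    return max(int(usable // per_sample_gb), 1)
```

With these illustrative numbers, a 24 GB card whose weights already occupy ~16 GB (leaving 8 GB free) would get `pick_batch_size(8.0) == 4`, which matches the batch size that works in practice here.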