Dreambooth Stable Diffusion training in just 12.5 GB VRAM, using the 8bit adam optimizer from bitsandbytes along with xformers while being 2 times faster. #1167
ZeroCool22 started this conversation in General
Replies: 1 comment · 1 reply
-
This is really cool. You got it running locally? Do I just clone that repo into its own folder, run the commands listed there, and it will just work? I have a 16 GB mobile video card.
1 reply
-
Tested on an Nvidia A10G; training took 15-20 minutes. We can finally run this on Colab notebooks.
Code: https://github.com/ShivamShrirao/diffusers/blob/main/examples/dreambooth/
More details: huggingface/diffusers#554 (comment)
Reddit post: https://www.reddit.com/r/StableDiffusion/comments/xphaiw/dreambooth_stable_diffusion_training_in_just_125/
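For reference, the two memory savers named in the title are bitsandbytes' 8-bit Adam (optimizer state kept in 8 bits instead of 32) and xformers' memory-efficient attention. Below is a minimal sketch of how they are typically wired up with diffusers, not the linked training script itself; the model id, learning rate, and the `enable_xformers_memory_efficient_attention` call (available in recent diffusers releases) are assumptions here.

```python
# Sketch only: illustrates the two memory-saving pieces, not the repo's full script.
import bitsandbytes as bnb
from diffusers import UNet2DConditionModel

# Load just the UNet, the part of Stable Diffusion that DreamBooth fine-tunes.
# Model id is an illustrative assumption.
unet = UNet2DConditionModel.from_pretrained(
    "CompVis/stable-diffusion-v1-4", subfolder="unet"
)
unet.to("cuda")

# xformers memory-efficient attention reduces the attention memory footprint.
unet.enable_xformers_memory_efficient_attention()

# 8-bit Adam from bitsandbytes stores optimizer state in 8 bits, which is where
# most of the VRAM savings over standard 32-bit AdamW state comes from.
optimizer = bnb.optim.AdamW8bit(unet.parameters(), lr=5e-6)
```

In the linked training script these options are exposed as configuration switches; the sketch just spells out what they do under the hood.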