Replies: 4 comments 1 reply
-
There is a checkbox inside the extension GUI called "Low VRAM VAE". Maybe you tried it already.
-
I have the same problem. I've got an RTX 3060 Ti with 8 GB VRAM. The problem also occurs at 128x128, 5 frames, with the low-VRAM option checked. Why could that be? I closed all programs running in the background and have no problems with plain SD. The error: CUDA out of memory. Tried to allocate 12.00 MiB (GPU 0; 8.00 GiB total capacity; 7.19 GiB already allocated; 0 bytes free; 7.34 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
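The error message above suggests setting `max_split_size_mb`. As a minimal sketch (the value 128 is an illustrative choice, not a recommendation from this thread), the option has to be set before PyTorch initializes CUDA:

```python
import os

# Assumption/sketch: set the allocator option before `import torch` (or before
# the first CUDA call); once the caching allocator is initialized, the setting
# is ignored. 128 MiB is an illustrative value; smaller values reduce
# fragmentation at some allocation-speed cost.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"
```

If you launch the webui through a batch or shell script, exporting the same environment variable there has the equivalent effect.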
-
Added VAE halving; it should use far less VRAM now.
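"VAE halving" here presumably means casting the VAE weights to float16 (in PyTorch, `module.half()`). The memory effect can be illustrated with NumPy; the array shape below is an arbitrary example, not taken from this thread:

```python
import numpy as np

# Illustrative sketch: a float16 array of the same shape occupies exactly half
# the memory of its float32 counterpart, which is where the VRAM saving comes from.
full = np.zeros((1, 3, 512, 512), dtype=np.float32)  # example image-sized tensor
half = full.astype(np.float16)
print(full.nbytes // half.nbytes)  # 2
```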
-
So no one's found a solution?
-
I use this extension on my laptop; my GPU is a 4060 laptop (140 W) with 8 GB.
I'm confused by this: when I check with nvidia-smi, my GPU shows as empty... can anyone help with this?