Which Flux model should I load in the desktop version? #7397
16GB is extremely low memory for Flux, but you could try your luck with GGUF-quantized models. Here's an example workflow (you just need to install a custom node), and here are the models you need:
This should be the lightest setup for Flux Dev; let's see if you can make it work (success is not guaranteed).
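To see why a GGUF quant is the only realistic option here, a back-of-the-envelope sketch helps. This assumes the commonly cited ~12B-parameter size for the Flux.1 Dev transformer and approximate GGUF bits-per-weight (Q8_0 is about 8.5 bits/weight, Q4_K_S about 4.5); the exact figures vary by quant type:

```python
# Rough memory estimate for the Flux.1 Dev transformer (~12B parameters).
# Bytes-per-weight values for GGUF quants are approximations.
PARAMS = 12e9

bytes_per_weight = {
    "fp16 (original)": 2.0,
    "Q8_0 (GGUF)": 8.5 / 8,
    "Q4_K_S (GGUF)": 4.5 / 8,
}

for fmt, bpw in bytes_per_weight.items():
    gib = PARAMS * bpw / 1024**3
    print(f"{fmt}: ~{gib:.1f} GiB for weights alone")
```

The fp16 weights alone land around 22 GiB, well past 16GB of unified memory, while a Q4-class quant drops to roughly 6 GiB, leaving some headroom for the text encoder, VAE, and activations. That is why the heavily quantized variants are the ones worth trying on this machine.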
Good day, everyone. I'm seeking help from the community. Could you please tell me which version of Flux runs stably on a MacBook Pro M1 with 16GB RAM? Which versions of other models are you running without crashes? Thank you.