Is it possible to force Pytorch to use less VRAM without using the -t parameter? #282
Unanswered
curiosport asked this question in Q&A
Replies: 0 comments
When I tried to convert an image, PyTorch threw an insufficient-VRAM error: I was about 2 GB short of what it needed to complete the task. I was able to work around it with the "-t 512" parameter, but that introduced quality problems, with visible blocks now appearing in the output. So my question is: is there any way to force PyTorch to use less VRAM? It doesn't matter if it takes longer; the goal is to get the same quality as without the -t parameter.
Or is my only option to upgrade to a GPU with more VRAM? I currently have 11 GB of VRAM.
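For context on why the -t parameter causes those blocks: tiling splits the image into independent patches so only one patch occupies VRAM at a time, but each patch is processed without seeing its neighbors, so seams can appear at tile borders. A common mitigation is to make the tiles overlap and average the results in the overlapping regions. Below is a minimal, hypothetical sketch of that idea (the `tiled_inference` helper is illustrative, not this project's actual code, and it assumes a model that preserves spatial size):

```python
import torch

def tiled_inference(model, img, tile=512, overlap=32):
    """Run `model` on overlapping tiles of `img` (shape 1xCxHxW) and
    stitch the results, averaging the overlap regions to reduce
    visible block seams. Assumes the model preserves spatial size;
    an upscaling model would need extra coordinate bookkeeping."""
    _, _, h, w = img.shape
    out = torch.zeros_like(img)
    weight = torch.zeros(1, 1, h, w)   # how many tiles covered each pixel
    step = tile - overlap
    for y in range(0, h, step):
        for x in range(0, w, step):
            # Clamp so the last tile hugs the image border instead
            # of running past it.
            y0 = min(y, max(h - tile, 0))
            x0 = min(x, max(w - tile, 0))
            patch = img[:, :, y0:y0 + tile, x0:x0 + tile]
            with torch.no_grad():       # inference only: no autograd buffers
                res = model(patch)
            ph, pw = patch.shape[2], patch.shape[3]
            out[:, :, y0:y0 + ph, x0:x0 + pw] += res
            weight[:, :, y0:y0 + ph, x0:x0 + pw] += 1
    return out / weight                 # average where tiles overlapped

# Usage with an identity "model" as a stand-in:
img = torch.rand(1, 3, 700, 900)
result = tiled_inference(torch.nn.Identity(), img, tile=512, overlap=32)
assert result.shape == img.shape
```

Peak VRAM then scales with the tile size rather than the full image, at the cost of redundant work in the overlaps. Whether this particular tool exposes an overlap option is a separate question for the maintainers.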