CUDA Memory Requirements #73
Apologies if this is covered somewhere and I missed it, but what do I need to change to get this running with 4 GB of VRAM? I've tried editing the tts_memory_threshold setting, but I'm still running out of CUDA VRAM while "Warming models". Any tips? Thanks!
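Before touching settings, it's worth checking how much VRAM is actually free on the card before the server warms anything, since a desktop environment or browser can already be holding a few hundred MB. A minimal sketch, assuming PyTorch with CUDA support is installed in the same environment as the server:

```python
# Report free vs. total VRAM on the first CUDA device before any models load.
import torch

if torch.cuda.is_available():
    free_bytes, total_bytes = torch.cuda.mem_get_info()
    print(f"Device: {torch.cuda.get_device_name(0)}")
    print(f"Free:   {free_bytes / 1024**3:.2f} GiB")
    print(f"Total:  {total_bytes / 1024**3:.2f} GiB")
else:
    print("No CUDA device visible to PyTorch")
```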
  
What hardware are you using with 4GB VRAM?
  
I had an old GTX 1050 Ti lying around and was trying that out. Is that too low end?
  
pman, can you list the settings you used for your 1050 Ti? I'm using the same card but can't seem to find the right combination of settings to get the server to start. I always hit 'CUDA failed with error out of memory'. Thanks in advance, ~Mark
  
I just pushed a change to the wisng branch. I had to guess at the actual available CUDA memory the device reports, but it should be close. I'm showing all of the Whisper models using 3.245 GB of VRAM here in int8.
wisng will likely be merged to main this week, so we'd really appreciate any testing you can do with it!
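For anyone testing wisng on a 4 GB card, a rough way to reproduce that kind of int8 VRAM measurement outside the server is with a CTranslate2-based Whisper runtime such as faster-whisper. This is only a sketch, not the server's actual warming code, and the model sizes listed are illustrative:

```python
# Load Whisper models with int8 weights on the GPU and report how much VRAM
# they take. Standalone sketch; not the server's warming logic.
import torch
from faster_whisper import WhisperModel


def free_vram_gib() -> float:
    free_bytes, _total = torch.cuda.mem_get_info()
    return free_bytes / 1024**3


before = free_vram_gib()

# Illustrative model list; drop the larger sizes first if a 4 GB card
# still runs out of memory.
models = {
    size: WhisperModel(size, device="cuda", compute_type="int8")
    for size in ("base", "medium")
}

after = free_vram_gib()
print(f"VRAM used by loaded models: {before - after:.2f} GiB")
```

Measuring at the driver level with `torch.cuda.mem_get_info()` (rather than through any one framework's allocator) has the advantage of counting CTranslate2's allocations as well as PyTorch's.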