Labels: question (Further information is requested)
Description
I am using a server with two 24GB RTX 3090 GPUs. When I run `bash run_baselines_lora.sh`, I get an out-of-memory error on the GPU. How can the code be configured to use both GPUs? Or, if dual-GPU training is not supported, what can I do to get it running on a single 3090?
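For context on why 24GB may not be enough, here is a rough back-of-the-envelope memory estimate for training (this is my own illustrative arithmetic, not part of the repository; the parameter count of 7B and the LoRA trainable fraction are assumptions, and activation memory is ignored entirely):

```python
def estimate_train_mem_gb(n_params, trainable_frac=1.0, bytes_per_weight=2):
    """Rough GPU memory estimate for training, excluding activations.

    Assumes fp16 weights (2 bytes/param), gradients in the same dtype
    for trainable params only, and Adam optimizer states (two fp32
    tensors, i.e. 8 bytes, per trainable param).
    """
    weights = n_params * bytes_per_weight                   # model weights
    grads = n_params * trainable_frac * bytes_per_weight    # gradients
    optim = n_params * trainable_frac * 8                   # Adam m and v (fp32)
    return (weights + grads + optim) / 1024**3

# Hypothetical 7B-parameter model:
full_ft = estimate_train_mem_gb(7e9)                        # full fine-tuning
lora = estimate_train_mem_gb(7e9, trainable_frac=0.005)     # ~0.5% params via LoRA
print(f"full fine-tune: ~{full_ft:.0f} GB, LoRA: ~{lora:.0f} GB")
```

By this estimate, full fine-tuning of a 7B model needs roughly 78GB just for weights, gradients, and optimizer states, while LoRA brings the static cost down near 14GB, leaving little headroom on a 24GB card once activations and CUDA overhead are added. That suggests reducing the batch size (with gradient accumulation to compensate) or enabling gradient checkpointing in the script, if it exposes such options.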