SF3D_USE_CPU=1 does not force CPU usage #85

@opsec-ai

Description

$ export SF3D_USE_CPU=1 
$ python3.10 main.py --novram --fast fp8_matrix_mult cublas_ops fp16_accumulation --disable-smart-memory

Total VRAM 4031 MB, total RAM 32036 MB
pytorch version: 2.7.0+cu126
Enabled fp16 accumulation.
Set vram state to: NO_VRAM
Disabling smart memory management
Device: cuda:0 Quadro M3000M : native
Checkpoint files will always be loaded safely.
Using pytorch attention
...
torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 20.00 MiB. GPU 0 has a total capacity of 3.94 GiB of which 6.62 MiB is free
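
For reference, the kind of guard I would expect SF3D_USE_CPU=1 to trigger at device-selection time looks roughly like the sketch below. The function name and the set of accepted values are my own assumptions, not taken from the SF3D code; with the variable exported as above it should return "cpu", whereas the log shows the model still being placed on cuda:0.

# Minimal sketch (assumption): how SF3D_USE_CPU could be honored when
# picking a torch device. Names here are hypothetical, not from SF3D.
import os
import torch

def pick_sf3d_device() -> torch.device:
    """Return CPU when SF3D_USE_CPU is set to a truthy value, else CUDA if available."""
    use_cpu = os.environ.get("SF3D_USE_CPU", "0").lower() in ("1", "true", "yes")
    if use_cpu or not torch.cuda.is_available():
        return torch.device("cpu")
    return torch.device("cuda:0")

if __name__ == "__main__":
    # With `export SF3D_USE_CPU=1` this prints "cpu"; the traceback above
    # indicates the allocation still happened on the 4 GB GPU instead.
    print(pick_sf3d_device())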
