
CUDA out of memory for large files #2

@canadaduane

Description


I'm curious whether there's a way to turn this into a streaming/batch process, rather than loading the entire audio file into memory at once. For example, I tried to transcribe a 135 MB file (a 2.5-hour conversation) and it failed with "CUDA out of memory. Please try a shorter audio or reduce GPU load." (this is on a 12 GB GPU).
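One way to get the streaming behavior described above is to split the decoded audio into fixed-length, slightly overlapping windows and feed the model one window at a time, so peak GPU memory is bounded by the chunk size rather than the file length. A minimal sketch in plain Python; `transcribe` is a hypothetical stand-in for whatever model call this project actually exposes, and the 30 s window / 0.5 s overlap are illustrative assumptions, not values from the project:

```python
def iter_chunks(samples, sample_rate, chunk_seconds=30.0, overlap_seconds=0.5):
    """Yield (start_sample, window) pairs covering the whole recording.

    A small overlap between consecutive windows lets the caller stitch
    together words that fall on a chunk boundary.
    """
    size = int(chunk_seconds * sample_rate)                     # samples per window
    step = int((chunk_seconds - overlap_seconds) * sample_rate)  # hop between windows
    for start in range(0, len(samples), step):
        yield start, samples[start:start + size]


def transcribe_streaming(samples, sample_rate, transcribe):
    """Run a (hypothetical) per-chunk transcriber over the audio.

    Only one window's worth of audio is handed to the model at a time,
    so GPU memory no longer scales with the length of the file.
    """
    pieces = []
    for _, chunk in iter_chunks(samples, sample_rate):
        pieces.append(transcribe(chunk))
    return " ".join(pieces)
```

Real implementations (e.g. Whisper-based pipelines) usually also merge overlapping transcripts and cut on silence rather than at fixed offsets, but even this naive fixed-window version keeps a 2.5-hour file from being resident on the GPU all at once.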
