Code crashes partway through a run: CUDA out of memory #126

@susht3

Description

🐛 Bug description

Finetuning suddenly hits OOM partway through. Do I need to limit the input length? Does the code truncate inputs internally? Currently I am not limiting the input length at all.

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 96.00 MiB (GPU 0; 31.74 GiB total capacity; 27.71 GiB already allocated; 91.12 MiB free; 31.22 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
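A minimal sketch of one workaround, assuming the training code does not truncate for you: cap the token length of each sample before batching, so activation memory stays bounded. The names `truncate_ids` and `MAX_LEN` are illustrative, not from this repository, and the right limit depends on the model's actual context size.

```python
MAX_LEN = 2048  # assumed context limit; check your model's real max length

def truncate_ids(token_ids, max_len=MAX_LEN):
    """Drop tokens beyond max_len so per-sample memory is bounded."""
    return token_ids[:max_len]

# Example: a batch with one overly long sample and one short sample.
batch = [list(range(3000)), list(range(100))]
batch = [truncate_ids(ids) for ids in batch]
print([len(ids) for ids in batch])  # long sample is capped, short one untouched
```

Separately, since the traceback itself suggests it, setting `PYTORCH_CUDA_ALLOC_CONF` (e.g. `max_split_size_mb`) may reduce fragmentation, but it will not help if the inputs are genuinely too long for the GPU.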

Python Version

None

Metadata

Labels

bug: Something isn't working
