H20 with 96 GB of VRAM: why does qwen-image-edit LoRA fine-tuning still fail with an out-of-memory error? #1238

@xiaojun3619

Description

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 18.00 MiB. GPU has a total capacity of 95.00 GiB of which 6.50 MiB is free. Process 3927658 has 73.02 GiB memory in use. Process 4105349 has 21.96 GiB memory in use. Of the allocated memory 21.65 GiB is allocated by PyTorch, and 1.51 MiB is reserved by PyTorch but unallocated.
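The trace itself points at the cause: of the 95 GiB on the card, another process (PID 3927658) is holding 73.02 GiB, so the training process only ever sees roughly 22 GiB and fails on an 18 MiB allocation. A minimal sketch, assuming PyTorch, for verifying how much VRAM is actually free before launching the LoRA run (torch.cuda.mem_get_info is a standard PyTorch API; the threshold value is only an illustrative assumption):

```python
import torch

# Free/total device memory in bytes, as reported by the CUDA driver.
# This counts memory held by *all* processes on the GPU, not just this one.
free_bytes, total_bytes = torch.cuda.mem_get_info()
free_gib = free_bytes / 1024**3
total_gib = total_bytes / 1024**3
print(f"GPU memory: {free_gib:.2f} GiB free of {total_gib:.2f} GiB total")

# Hypothetical threshold: adjust to the real footprint of the fine-tuning job.
REQUIRED_GIB = 40  # assumption, not a measured requirement
if free_gib < REQUIRED_GIB:
    raise RuntimeError(
        f"Only {free_gib:.2f} GiB free; another process may be occupying the GPU. "
        "Check nvidia-smi and free the device before starting training."
    )
```

If the check fails, stopping or moving the other process (or pinning the job to an idle GPU via CUDA_VISIBLE_DEVICES) is the first thing to try before tuning batch size or enabling memory-saving options.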
