
Peak Memory Usage #11

@Fuger2021

Description


👋 Hello, I noticed that the paper states:

Notably, with our two-level invertible models, the peak memory use per GPU is suppressed to 14.64GB, making it manageable not only for A100 but also for other GPUs like 3090 and 4090 (24GB).

However, when I reproduce this on a single RTX 3090, the actual GPU memory usage is around 20 GB.

The configuration is as follows:

  • $T$ = 3
  • batch_size = 8 (default)
  • patch_size = 256 (default)

Could you advise how to keep the GPU memory usage within 14 GB? Thanks.
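
For reference, below is a minimal sketch of how peak allocated memory per GPU can be checked with standard PyTorch APIs. The model and input here are placeholders for illustration only, not names from this repository; substitute the repo's actual model and dataloader.

```python
import torch
import torch.nn as nn

# Placeholder model/input purely for illustration; replace with the
# repository's actual model, batch_size=8, patch_size=256 setup.
model = nn.Linear(256, 256).cuda()
batch = torch.randn(8, 256, device="cuda")

torch.cuda.reset_peak_memory_stats()   # clear previously recorded peaks

out = model(batch)                     # one forward pass
out.sum().backward()                   # and one backward pass

# Peak tensor allocations on the current device, in GB.
peak_gb = torch.cuda.max_memory_allocated() / 1024**3
print(f"Peak allocated memory: {peak_gb:.2f} GB")
```

Note that if the 20 GB figure comes from nvidia-smi, it includes the caching allocator's reserved blocks plus the CUDA context, so it can be noticeably higher than the peak-allocated value; if the paper's 14.64 GB refers to peak allocated memory, that may account for part of the gap.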
