VRAM, multi GPU

#8
by tangxiaochu - opened

How much VRAM does this model need for, say, 720 × 1280 at 160 frames? Does it support multi-GPU?

Knowledge Engineering Group (KEG) & Data Mining at Tsinghua University org

Our README covers VRAM consumption. We haven't benchmarked memory usage precisely, but generally speaking a 32GB device can run it normally.
We do support multiple GPUs; please refer to the execution code in cli_demo.py for the details. Thank you.
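For reference, here is a minimal sketch of the kind of memory-saving setup the reply alludes to, assuming the model loads through a diffusers-style pipeline. The checkpoint id, pipeline call parameters, and memory switches below are placeholders and assumptions, not taken from this thread; the repository's cli_demo.py remains the authoritative source for how VRAM reduction and multi-GPU execution are actually wired up.

```python
# Hedged sketch: lowering VRAM for a diffusers-style video pipeline.
# The checkpoint id and generation parameters are hypothetical placeholders;
# check the repo's cli_demo.py for the exact loading and multi-GPU code.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "THUDM/your-model-id",        # placeholder checkpoint id
    torch_dtype=torch.bfloat16,
)

# Common memory-saving switches; availability depends on the pipeline class.
pipe.enable_sequential_cpu_offload()   # stream weights to the GPU module by module
if hasattr(pipe, "vae"):
    pipe.vae.enable_tiling()           # decode video latents in tiles to cap peak VRAM

video = pipe(
    prompt="a drone shot over a coastline at sunset",
    height=720,
    width=1280,
    num_frames=160,                    # assumed to mirror the question's settings
).frames[0]
```

With sequential CPU offload and VAE tiling enabled, peak usage is typically governed by the largest single module plus the activation memory of one decoding tile, which is why a 32GB card is often sufficient even for long, high-resolution clips.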
