using too much vram #3445

Open
Uemuet opened this issue May 11, 2024 · 0 comments

Comments


Uemuet commented May 11, 2024

Lately I keep hitting the limits of my VRAM and I don't know why: I can now render fewer frame batches than before, and I get this error:

torch.cuda.OutOfMemoryError: Allocation on device 0 would exceed allowed memory. (out of memory)
Currently allocated : 27.28 GiB
Requested : 22.21 GiB
Device limit : 39.56 GiB
Free (according to CUDA): 20.81 MiB
PyTorch limit (set by user-supplied memory fraction) : 17179869184.00 GiB
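The numbers in the log already explain the failure: the new request alone is larger than the headroom left on the device. A quick sanity check in plain Python (values copied from the message above; variable names are illustrative):

```python
# Figures reported by the CUDA allocator, in GiB.
device_limit = 39.56         # total memory visible on device 0
currently_allocated = 27.28  # already held by PyTorch tensors
requested = 22.21            # size of the new allocation

# Best-case headroom, ignoring fragmentation and caching:
headroom = device_limit - currently_allocated
print(f"headroom = {headroom:.2f} GiB, requested = {requested:.2f} GiB")

# The allocation fails because the request exceeds the headroom.
assert requested > headroom
```

So even with a 40 GB device, a single ~22 GiB request on top of ~27 GiB already allocated cannot succeed; reducing the batch size (or freeing cached tensors first) is what shrinks these two numbers.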

A month ago I could render 400 frames at high resolution.
My environment is Colab Pro.
