This is really interesting work! I tried to run it on a server with a 24gb 3090, and it runs into out of memory. Is a larger GPU needed? Please let me know what size GPU you are able to run it on.
Thanks
I encountered the same error and solved it. In my experience, the 'CUDA out of memory' error can be resolved just by adding a couple of arguments (e.g. --gradient_accumulation_steps=1 --gradient_checkpointing).
The link below might help: huggingface/diffusers#696
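For context, those two flags map to standard PyTorch memory-saving techniques: accumulating gradients over several micro-batches before each optimizer step, and recomputing activations during the backward pass instead of storing them. Here is a minimal, hedged sketch of both (plain PyTorch with a toy model; the model, sizes, and step counts are illustrative and not from this repo's training script):

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

torch.manual_seed(0)
# Toy model standing in for the real network (illustrative only).
model = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

accumulation_steps = 4  # analogous to --gradient_accumulation_steps=4
data = [(torch.randn(2, 8), torch.randn(2, 1)) for _ in range(accumulation_steps)]

optimizer.zero_grad()
for step, (x, y) in enumerate(data, start=1):
    # Gradient checkpointing: activations are recomputed in backward,
    # trading extra compute for lower peak memory (--gradient_checkpointing).
    out = checkpoint(model, x, use_reentrant=False)
    # Scale the loss so accumulated gradients match a full-batch update.
    loss = nn.functional.mse_loss(out, y) / accumulation_steps
    loss.backward()  # gradients accumulate across micro-batches
    if step % accumulation_steps == 0:
        optimizer.step()       # one optimizer update per effective batch
        optimizer.zero_grad()
```

Smaller micro-batches with a higher `accumulation_steps` keep the effective batch size while cutting activation memory, which is usually enough to fit this kind of training on a 24 GB card.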