Replies: 2 comments
-
Yes, I'm facing this issue too. I'm broadcasting the |
-
Thank you! Do you mean you add |
-
Your question:
I saw the following warning when using Horovod, even when I ran my code on a single node (and on multiple nodes as well, of course). Should I worry about this warning, and how can I deal with it? The warning did not appear when I used PyTorch DDP.
I know this warning relates to the order in which optimizer.step() and lr_scheduler.step() are called (ref: https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate). My code snippet is as follows. I definitely call lr_scheduler.step() after optimizer.step(), yet the warning still appeared.
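For reference, here is a minimal sketch of the calling order the PyTorch docs require (plain PyTorch, no Horovod; with Horovod you would additionally wrap the optimizer with hvd.DistributedOptimizer, but the step ordering is the same). Note that PyTorch emits this warning by patching optimizer.step() with a call counter, so a wrapper that re-wraps step() can plausibly trip the check even when the order is correct; this is an assumption about the cause, not a confirmed diagnosis.

```python
import torch

# Toy model, optimizer, and scheduler just to demonstrate the ordering.
model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.5)

x = torch.randn(8, 4)
y = torch.randn(8, 1)

for epoch in range(2):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()   # optimizer first...
    scheduler.step()   # ...then the scheduler, once per epoch

# lr was halved twice: 0.1 -> 0.05 -> 0.025
print(optimizer.param_groups[0]["lr"])
```

Running this order does not trigger the "lr_scheduler.step() before optimizer.step()" warning in plain PyTorch, which suggests the Horovod warning is spurious rather than a real ordering bug in the snippet above.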
Thank you.