
Politely asking and humbly seeking advice: linear decay learning rate policy #1627

Open

For0 opened this issue Feb 29, 2024 · 0 comments
Comments

For0 commented Feb 29, 2024

Hello author! Politely asking and humbly seeking advice: in the linear decay learning rate policy, the formula lr_l = 1.0 - max(0, epoch + opt.epoch_count - opt.n_epochs) / float(opt.n_epochs_decay + 1) and the hyperparameter opt.n_epochs_decay both suggest that the learning rate begins to decay at epoch 100, yet the description of opt.n_epochs says the initial learning rate is still kept at epoch 100. This seems contradictory and may need a fix. Hoping for a reply, love from China!
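For reference, here is a minimal sketch (not taken from the repository) of how the multiplier produced by that formula behaves around the transition point, assuming the defaults epoch_count=1, n_epochs=100, n_epochs_decay=100; these values are assumptions chosen for illustration:

```python
# Minimal sketch: evaluate the linear-decay multiplier from the formula above
# for a few epoch values. Assumed defaults (not quoted from the repository):
# epoch_count=1, n_epochs=100, n_epochs_decay=100.
epoch_count, n_epochs, n_epochs_decay = 1, 100, 100

def lambda_rule(epoch):
    # Multiplicative factor applied to the initial learning rate.
    return 1.0 - max(0, epoch + epoch_count - n_epochs) / float(n_epochs_decay + 1)

for epoch in [1, 99, 100, 101, 150, 200]:
    print(f"epoch {epoch:3d}: lr multiplier = {lambda_rule(epoch):.4f}")

# Expected output under these assumptions:
# epoch   1: lr multiplier = 1.0000
# epoch  99: lr multiplier = 1.0000
# epoch 100: lr multiplier = 0.9901  <- first value below 1.0 if the raw epoch
#                                       number is passed to the lambda
# epoch 101: lr multiplier = 0.9802
# epoch 150: lr multiplier = 0.4950
# epoch 200: lr multiplier = 0.0000
```

In practice a factor like this is typically supplied to torch.optim.lr_scheduler.LambdaLR, whose internal epoch counter starts at 0, so whether the multiplier applied during training epoch 100 is 1.0 or 0.9901 depends on how that counter is aligned with the epoch numbering in the training loop; that alignment seems to be the crux of the apparent contradiction.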
