
Question about AdderSR #64

Open
CLC0530 opened this issue Dec 8, 2021 · 2 comments

Comments

CLC0530 commented Dec 8, 2021

Hi, a few questions came up while I was trying to reproduce the paper:

  1. Since the paper says that shortcuts are needed to realize the identity mapping, does the network contain no standalone adder layers (i.e. every adder layer sits inside a residual block)? And does that mean each standalone convolutional layer in the original EDSR has to be replaced by an adder residual block?
  2. Your paper states: "the above function (power activation function) can be easily embedded into the conventional ReLU in any SISR models." What is the concrete implementation? Can I simply attach a power activation function right after the ReLU? (See the sketch below.)
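
A minimal PyTorch sketch of what these two questions describe, for reference. This is an assumption-laden illustration, not the authors' implementation: `adder2d` stands in for the AdderNet adder convolution (its real signature may differ), and the power activation is assumed to be sgn(x)·|x|^α with a learnable α.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the AdderNet adder convolution; the real layer
# lives in the authors' AdderNet code and its exact signature may differ.
from adder import adder2d


class PowerActivation(nn.Module):
    """Learnable power activation, assumed here to be sgn(x) * |x|**alpha."""

    def __init__(self, alpha=1.0):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(alpha))

    def forward(self, x):
        return torch.sign(x) * torch.abs(x).pow(self.alpha)


class AdderResBlock(nn.Module):
    """Question 1: an adder layer wrapped in a residual block, so that the
    shortcut carries the identity mapping a plain adder layer cannot learn."""

    def __init__(self, channels):
        super().__init__()
        self.adder = adder2d(channels, channels, kernel_size=3, padding=1)
        # Question 2 as originally phrased: a power activation appended
        # directly after the conventional ReLU.
        self.act = nn.Sequential(nn.ReLU(inplace=True), PowerActivation())

    def forward(self, x):
        return x + self.act(self.adder(x))
```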

CLC0530 commented Dec 8, 2021

Hi, after re-reading the paper I'd like to correct my second question: the power activation function is placed after the adder layer. Is a conventional activation function (e.g. ReLU) still needed? If so, where should it go? And should the shortcut connect directly after the adder layer, or after the power activation function and batch norm?
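
Since the thread never settles this, here is one plausible ordering sketched in PyTorch, again as an assumption rather than a confirmed answer: adder layer, then power activation (in place of ReLU), then batch norm, with the shortcut closing over all three.

```python
import torch
import torch.nn as nn

from adder import adder2d  # hypothetical import, as in the sketch above


class AdderSRBlock(nn.Module):
    """One plausible ordering (an assumption, not confirmed by the authors):
    adder layer -> power activation -> batch norm, shortcut around all three."""

    def __init__(self, channels, alpha=1.0):
        super().__init__()
        self.adder = adder2d(channels, channels, kernel_size=3, padding=1)
        self.alpha = nn.Parameter(torch.tensor(alpha))
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x):
        y = self.adder(x)
        # Power activation placed right after the adder layer.
        y = torch.sign(y) * torch.abs(y).pow(self.alpha)
        y = self.bn(y)
        # Shortcut lands after the power activation and batch norm.
        return x + y
```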

Totorol commented Dec 24, 2021

Did you manage to reproduce it?
