About alpha #163

Open
andyzgj opened this issue Jul 28, 2021 · 1 comment

Comments


andyzgj commented Jul 28, 2021

Hi, I have a quick question.
Why do all 8 cells share the same alpha?

def _initialize_alphas(self):
    # One row of alphas per edge in the cell DAG:
    # k = 2 + 3 + ... + (self._steps + 1), i.e. k = 14 when steps = 4.
    k = sum(1 for i in range(self._steps) for n in range(2 + i))
    num_ops = len(PRIMITIVES)

    # A single (k, num_ops) tensor is created for all normal cells,
    # and a single one for all reduction cells.
    self.alphas_normal = Variable(1e-3 * torch.randn(k, num_ops).cuda(), requires_grad=True)
    self.alphas_reduce = Variable(1e-3 * torch.randn(k, num_ops).cuda(), requires_grad=True)
    self._arch_parameters = [
        self.alphas_normal,
        self.alphas_reduce,
    ]

The code here looks like every cell shares the same alpha.
Shouldn't each cell have an independent alpha?
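
For concreteness, a quick check against the search model shows this. This is a hypothetical snippet; the Network constructor and the arch_parameters() accessor are assumed to follow the original DARTS model_search.py. No matter how many cells the model stacks, it exposes exactly two alpha tensors:

import torch.nn as nn
from model_search import Network  # DARTS search model (assumed import path)

# 8 stacked cells, but only two alpha tensors in total.
model = Network(C=16, num_classes=10, layers=8, criterion=nn.CrossEntropyLoss())
print(len(model.arch_parameters()))  # 2: alphas_normal and alphas_reduce
print(model.alphas_normal.shape)     # torch.Size([14, num_ops]), shared by all normal cells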

@sorobedio

According to the paper, all normal cells share the same architecture, so all normal cells have the same alphas; the same holds for the reduction cells.
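
You can see this in the forward pass. The sketch below follows the original DARTS model_search.py (self.cells, cell.reduction, self.global_pooling, and self.classifier are names from that implementation; treat it as illustrative rather than a verbatim quote of this repo). Every normal cell reads the same softmaxed alphas_normal, and every reduction cell the same alphas_reduce:

import torch.nn.functional as F

def forward(self, input):
    s0 = s1 = self.stem(input)
    for cell in self.cells:
        # All cells of a given type index into the same alpha tensor.
        if cell.reduction:
            weights = F.softmax(self.alphas_reduce, dim=-1)
        else:
            weights = F.softmax(self.alphas_normal, dim=-1)
        s0, s1 = s1, cell(s0, s1, weights)
    out = self.global_pooling(s1)
    logits = self.classifier(out.view(out.size(0), -1))
    return logits

Because the same tensor is passed to every cell of a given type, one gradient step on alphas_normal updates the architecture choice for all normal cells at once, which matches the paper's design of searching for a single shared cell.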
