
Section 5.5.1: the parameters of the LeNet fully connected layer look wrong #159

Open
klyue opened this issue Oct 31, 2020 · 2 comments

Comments


klyue commented Oct 31, 2020

Bug description

self.fc = nn.Sequential(
            nn.Linear(16*4*4, 120),
            nn.Sigmoid(),
            nn.Linear(120, 84),
            nn.Sigmoid(),
            nn.Linear(84, 10)
)
Shouldn't the first argument of nn.Linear on the second line be 16*5*5?
The corresponding line of code in the English edition is:
nn.Linear(in_features=16*5*5, out_features=120),

**Version info**
pytorch:
torchvision:
torchtext:
...

jianli-Alex commented Oct 31, 2020

There is actually no problem. The input here is batch_size×1×28×28, so the flattened size is 16*4*4. The English edition's input is probably batch_size×1×32×32 (or batch_size×1×28×28 with padding=2 in the first convolution). I haven't read the English edition, so I'm not sure which it is, but the input images in the 1998 LeNet paper were indeed 32×32.


Desperat1on commented Jul 19, 2022

> There is actually no problem. The input here is batch_size×1×28×28, so the flattened size is 16*4*4. The English edition's input is probably batch_size×1×32×32 (or batch_size×1×28×28 with padding=2 in the first convolution). I haven't read the English edition, so I'm not sure which it is, but the input images in the 1998 LeNet paper were indeed 32×32.

Thanks. At first I also thought there was a problem. The original paper used MNIST, with each image 32 pixels in both height and width, whereas this example uses Fashion-MNIST, where each image is 28 pixels in both height and width.
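The shape arithmetic described in the replies can be verified with a short calculation. The sketch below (plain Python, no PyTorch required; the helper names are my own) applies the standard convolution/pooling output-size formula, floor((size + 2*padding - kernel) / stride) + 1, to LeNet's two 5×5 convolutions and two 2×2 max-pools for each of the input configurations discussed:

```python
def conv_out(size, kernel, stride=1, padding=0):
    """Output size of a conv or pool layer: floor((size + 2p - k) / s) + 1."""
    return (size + 2 * padding - kernel) // stride + 1

def lenet_feature_size(size, first_padding=0):
    """Spatial size of the 16-channel feature map before flattening."""
    size = conv_out(size, 5, padding=first_padding)  # conv1: 5x5 kernel
    size = conv_out(size, 2, stride=2)               # maxpool1: 2x2, stride 2
    size = conv_out(size, 5)                         # conv2: 5x5 kernel
    size = conv_out(size, 2, stride=2)               # maxpool2: 2x2, stride 2
    return size

# 28x28 Fashion-MNIST input, no padding: 4x4 map -> nn.Linear(16*4*4, 120)
print(lenet_feature_size(28))                   # 4
# 32x32 input (as in the 1998 paper): 5x5 map -> nn.Linear(16*5*5, 120)
print(lenet_feature_size(32))                   # 5
# 28x28 input with padding=2 on conv1 behaves like a 32x32 input:
print(lenet_feature_size(28, first_padding=2))  # 5
```

So both editions are internally consistent; the in_features of the first nn.Linear simply follows from the input resolution and the padding of the first convolution.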
