
ViT Adapter Not Working With Patch Size Different From 16 #151

Open
MatCorr opened this issue Nov 3, 2023 · 1 comment

MatCorr commented Nov 3, 2023

I need to train a segmentor that uses a Transformer that has been pre-trained with patch_size=14.

I've made some adaptations to the ViT-Adapter/segmentation/mmseg_custom/models/backbones/vit_adapter.py file to allow for that, since patch_size was hard-coded to 16 at several points in the code.

However, with that issue resolved, I'm now running into a problem in the ViT-Adapter/segmentation/ops/modules/ms_deform_attn.py file, which raises this error when I try to train a model with patch_size 14.

File "/ViT-Adapter/segmentation/ops/modules/ms_deform_attn.py", line 105, in forward
    assert (input_spatial_shapes[:, 0] * input_spatial_shapes[:, 1]).sum() == Len_in
AssertionError
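For context, here is a minimal sketch (plain Python, with made-up sizes; none of these names come from the ViT-Adapter code) of why this assertion can fail: the deformable-attention module checks that the per-level (H, W) spatial shapes multiply out to the flattened token length, so if any shape is still derived from patch_size=16 while the tokens were produced on a patch_size=14 grid, the sum no longer matches.

```python
def flattened_length(spatial_shapes):
    """Total number of tokens implied by per-level (H, W) feature-map shapes."""
    return sum(h * w for h, w in spatial_shapes)

img = 518  # hypothetical input size, divisible by 14

# Tokens actually produced by a ViT with patch_size=14:
h14, w14 = img // 14, img // 14   # 37 x 37
len_in = h14 * w14                # 1369 flattened tokens

# Shape the attention module expects if 16 is still hard-coded somewhere:
h16, w16 = img // 16, img // 16   # 32 x 32
expected = flattened_length([(h16, w16)])  # 1024

# 1024 != 1369 -> the assert in ms_deform_attn.py fires
print(expected, len_in, expected == len_in)
```

In the real code there are several pyramid levels rather than one, but the mismatch works the same way: every (H, W) fed into the deformable attention has to be derived from the actual patch grid.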

Can anyone help me as to what needs to be changed in the Deformable Attention code to allow a patch size that's different from 16?

Thanks!

@MatCorr MatCorr changed the title ViT Adapter Not Working With Patch 14 ViT Adapter Not Working With Patch Size Different From 16 Nov 3, 2023

MatCorr commented Nov 3, 2023

OK, using the code pointed to here, I converted the weights that had been pre-trained with patch_size=14.

However, I'm still hitting the same error. I'm using the weights for segmentation, not detection, so I'm wondering if that's the issue.
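For reference, a common way to adapt ViT weights across patch sizes or input resolutions is to bicubically resize the absolute position embeddings to the new token grid. This is only a hedged sketch of that general technique (the function name and shapes are assumptions, not what the linked conversion script necessarily does):

```python
import torch
import torch.nn.functional as F

def resize_pos_embed(pos_embed, old_grid, new_grid, num_extra_tokens=1):
    """Bicubically resize ViT position embeddings to a new square token grid.

    pos_embed: (1, num_extra_tokens + old_grid * old_grid, dim)
    """
    extra = pos_embed[:, :num_extra_tokens]   # e.g. the [CLS] token embedding
    grid = pos_embed[:, num_extra_tokens:]
    dim = grid.shape[-1]
    # (1, N, dim) -> (1, dim, H, W) so interpolate can treat it as an image
    grid = grid.reshape(1, old_grid, old_grid, dim).permute(0, 3, 1, 2)
    grid = F.interpolate(grid, size=(new_grid, new_grid),
                         mode="bicubic", align_corners=False)
    # back to (1, new_grid * new_grid, dim) and re-attach the extra tokens
    grid = grid.permute(0, 2, 3, 1).reshape(1, new_grid * new_grid, dim)
    return torch.cat([extra, grid], dim=1)

# e.g. pre-trained at 224/14 = 16x16 tokens, fine-tuned at 512/16 = 32x32 tokens
pe = torch.randn(1, 1 + 16 * 16, 768)
print(resize_pos_embed(pe, 16, 32).shape)
```

Note that even with correctly converted weights, the assertion above will still fire if the segmentation pipeline itself computes spatial shapes from a hard-coded patch size of 16.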
