I need to train a segmentor that uses a Transformer pre-trained with patch_size=14.
I've made some adaptations in the ViT-Adapter/segmentation/mmseg_custom/models/backbones/vit_adapter.py file to allow for that, since patch_size was hard-coded to 16 in several places.
With that issue resolved, however, I'm now running into a problem with the ViT-Adapter/segmentation/ops/modules/ms_deform_attn.py file, which raises the following error when I try to train a model with patch_size=14:
```
File "/ViT-Adapter/segmentation/ops/modules/ms_deform_attn.py", line 105, in forward
    assert (input_spatial_shapes[:, 0] * input_spatial_shapes[:, 1]).sum() == Len_in
AssertionError
```
Can anyone help me figure out what needs to be changed in the Deformable Attention code to support a patch size other than 16?
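For reference, the failing assertion checks that the per-level spatial shapes multiply out to the flattened token count. A minimal sketch of the mismatch, with hypothetical numbers (I'm assuming a 224×224 crop and the adapter's usual 8/16/32-stride feature pyramid; the exact level sizes in your run may differ):

```python
# The assertion in ms_deform_attn.py checks that the per-level (H, W)
# spatial shapes account for every token in the flattened input:
#     sum(H_l * W_l for each level) == Len_in

def flattened_length(spatial_shapes):
    """Total number of tokens across all feature levels."""
    return sum(h * w for h, w in spatial_shapes)

# With patch_size=16 on a 224x224 crop, the patch grid is 14x14 and
# the three pyramid levels are 28x28, 14x14, 7x7 -- the check passes:
tokens_16 = flattened_length([(28, 28), (14, 14), (7, 7)])
assert tokens_16 == 1029  # 784 + 196 + 49

# With patch_size=14 the patch grid becomes 16x16, so the features
# actually flattened contain 32*32 + 16*16 + 8*8 tokens. If the
# spatial_shapes tensor handed to MSDeformAttn is still built by
# dividing the image size by 8/16/32 (i.e. still assuming a
# 16-pixel patch), the two sides disagree and the assert fires:
tokens_14 = flattened_length([(32, 32), (16, 16), (8, 8)])
assert tokens_14 == 1344
assert tokens_16 != tokens_14  # mismatch -> AssertionError in forward()
```

So my guess is that the spatial shapes fed to the attention module are still being derived from the old patch size somewhere upstream, not that the attention code itself is wrong.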
Thanks!
MatCorr changed the title from "ViT Adapter Not Working With Patch 14" to "ViT Adapter Not Working With Patch Size Different From 16" on Nov 3, 2023.