Poor performance of Mask2Former based on coco-stuff-164k #141
Comments
Hello, thank you for your feedback. Can you give me some information about your environment?
Python: 3.7.10 (default, Feb 26 2021, 18:47:35) [GCC 7.3.0]
Could you please confirm whether the pre-trained weights were loaded successfully?
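One quick way to verify this (a minimal sketch; the helper name `check_loaded` is hypothetical, and the key lists would in practice come from `torch.load(ckpt_path)['state_dict'].keys()` and `model.state_dict().keys()` in an MMSegmentation-style checkpoint) is to compare the checkpoint keys against the model's parameter names:

```python
def check_loaded(model_keys, ckpt_keys):
    """Return (num_matched, keys missing from checkpoint, unexpected checkpoint keys)."""
    model_keys, ckpt_keys = set(model_keys), set(ckpt_keys)
    matched = model_keys & ckpt_keys
    return len(matched), sorted(model_keys - ckpt_keys), sorted(ckpt_keys - model_keys)

# Toy example with dummy key names; a large "missing" list on the backbone
# would indicate the pre-trained weights did not load.
matched, missing, unexpected = check_loaded(
    ['backbone.w', 'head.w'],  # model parameters
    ['backbone.w'])            # checkpoint parameters
print(matched, missing, unexpected)  # 1 ['head.w'] []
```

MMSegmentation also logs missing/unexpected keys at startup when `load_from` is set, so checking the training log is often enough.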
I just checked the config, and I think one possible reason is that I ran this experiment using 2 nodes, resulting in a total batch size of 16. If you are using only 1 node, you need to modify the config by changing the batch size per GPU from 1 to 2.
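For reference, the effective batch size is nodes × GPUs per node × samples per GPU (a sketch assuming 8 GPUs per node, which matches the numbers above; in MMSegmentation the per-GPU value is the `samples_per_gpu` field of the `data` dict in the config):

```python
def effective_batch_size(num_nodes, gpus_per_node, samples_per_gpu):
    # Total number of samples the optimizer sees per iteration
    # under distributed data-parallel training.
    return num_nodes * gpus_per_node * samples_per_gpu

# Author's setup: 2 nodes x 8 GPUs x 1 sample per GPU = 16.
print(effective_batch_size(2, 8, 1))  # 16
# Single-node reproduction: 1 node x 8 GPUs x 2 samples per GPU = 16.
print(effective_batch_size(1, 8, 2))  # 16
```

Keeping the effective batch size at 16 also keeps the learning-rate schedule consistent with the released config.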
Perhaps the number of nodes is crucial; let me give it a try.
Hi! It is excellent work.
By the way, I would like to ask a question about "Mask2Former beitv2 + coco-stuff-164k".
When I reproduced the "mask2former_beitv2_adapter_large_896_80k_cocostuff164k" experiment, the metrics were abnormally low; the details are as follows.
+-------+-------+-------+
| aAcc | mIoU | mAcc |
+-------+-------+-------+
| 67.52 | 37.57 | 48.97 |
+-------+-------+-------+
2023-10-13 15:14:11,850 - mmseg - INFO - The previous best checkpoint /root/paddlejob/workspace/env_run/xiachunlong/models/baidu/adu-lab/foundation_model_reasearch/ViT-Adapter/segmentation/work_dirs/mask2former_beitv2_adapter_large_896_80k_cocostuff164k_ss/best_mIoU_iter_6000.pth was removed
2023-10-13 15:14:29,770 - mmseg - INFO - Now best checkpoint is saved as best_mIoU_iter_8000.pth.
2023-10-13 15:14:29,771 - mmseg - INFO - Best mIoU is 0.3757 at 8000 iter.
Your reported results:
+-------+------+-------+
| aAcc | mIoU | mAcc |
+-------+------+-------+
| 70.74 | 46.1 | 58.27 |
+-------+------+-------+
Do the authors know what the problem might be?