
Evaluation of Trained SigLIP Checkpoints #824

Open
work4cs opened this issue Feb 21, 2024 · 1 comment

work4cs commented Feb 21, 2024

I trained a SigLIP model without problems, but when I load the training checkpoint to evaluate on a downstream task, I get this error:

RuntimeError: Error(s) in loading state_dict for CLIP: Unexpected key(s) in state_dict: "logit_bias".

raised at `incompatible_keys = model.load_state_dict(state_dict, strict=strict)`.

The state_dict from the checkpoint has the key "logit_bias", but the model does not.

I can pass a value for `init_logit_bias` to `create_model_and_transforms()` during evaluation so that `model.state_dict()` contains the `logit_bias` key. But is there a more elegant way, where I don't have to specify model-specific arguments at creation time and the necessary keys are loaded based on the checkpoint? (I tried `model.load_state_dict(state_dict, strict=False)`, but it did not load the correct values from the checkpoint, and I am not sure whether relaxing `strict` to `False` is safe/reliable.)
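For reference, a minimal sketch of two workarounds, assuming a standard PyTorch state dict (the model name, checkpoint path, and the `-10` bias value are placeholders, not from this thread):

```python
# Option 1: recreate the model with a logit_bias parameter so the
# checkpoint keys match. The init value does not matter because
# load_state_dict overwrites it with the trained value.
#
#   import open_clip, torch
#   model, _, preprocess = open_clip.create_model_and_transforms(
#       "ViT-B-16",            # placeholder: use your training config
#       init_logit_bias=-10,   # placeholder value; replaced on load
#   )
#   state_dict = torch.load("epoch_latest.pt", map_location="cpu")["state_dict"]
#   model.load_state_dict(state_dict, strict=True)

# Option 2: explicitly drop the SigLIP-only key before loading into a
# model that has no logit_bias parameter. Note this discards the
# trained bias, which matters if the downstream eval uses it.
def strip_keys(state_dict, extra_keys=("logit_bias",)):
    """Return a copy of the state dict without the listed keys."""
    return {k: v for k, v in state_dict.items() if k not in extra_keys}

# model.load_state_dict(strip_keys(state_dict), strict=True)
```

Explicitly stripping the key keeps `strict=True`, so any *other* mismatch between checkpoint and model still raises an error, unlike the blanket `strict=False`.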

@rwightman
Collaborator

@work4cs a SigLIP model created from one of the SigLIP model configs (e.g. https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/model_configs/ViT-B-16-SigLIP-256.json) will not have this issue. But if you train a 'non-siglip' model with SigLIP enabled, then yes, you either need to pass the logit_bias, OR create a new config for that model with the logit bias in it; it's technically no longer the original model config that you used.
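For the "new config" route, a sketch of what such a file could look like, modeled on the shipped SigLIP configs. The field names and values below are illustrative (a ViT-B-16-style config with the bias added), not a config from this thread; compare against the JSON files in `src/open_clip/model_configs/` of your installed version:

```json
{
  "embed_dim": 512,
  "init_logit_bias": -10,
  "vision_cfg": {
    "image_size": 224,
    "layers": 12,
    "width": 768,
    "patch_size": 16
  },
  "text_cfg": {
    "context_length": 77,
    "vocab_size": 49408,
    "width": 512,
    "heads": 8,
    "layers": 12
  }
}
```

Saved under a new name in the model_configs directory, this keeps the checkpoint self-describing: evaluation code can create the model by config name alone, without passing `init_logit_bias` at call sites.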
