NTXentLoss, normalize issue. #696
Ah, I found that you do the normalization step when calculating the similarity matrix. Another question: assume I have a dataset where the feature vector shape is [6, 100] and the label vector is [0, 1, 0, 3, 3, 1]. Is using NTXentLoss in this case similar to supervised contrastive learning? Thanks!
Check out the blue drop-down box in the documentation for
Thanks @stompsjo! Also, if you're looking specifically for Supervised Contrastive Learning, there is a loss function for that: https://kevinmusgrave.github.io/pytorch-metric-learning/losses/#supconloss
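To illustrate the two points discussed above, here is a minimal NumPy sketch (illustrative only, not pytorch-metric-learning's actual implementation): L2-normalizing the embeddings makes the dot-product matrix a cosine-similarity matrix, which is why no separate normalization step is needed before the loss; and with a label vector like [0, 1, 0, 3, 3, 1], positives are simply pairs sharing a label, which is how a supervised contrastive setup forms its positive set.

```python
import numpy as np

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(6, 100))   # feature vectors, shape [6, 100]
labels = np.array([0, 1, 0, 3, 3, 1])    # label vector from the question

# L2-normalize each row; the dot product of unit vectors is cosine similarity,
# so normalization can be folded into the similarity computation.
normalized = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
sim_matrix = normalized @ normalized.T   # shape [6, 6], values in [-1, 1]

# Positives are pairs that share a label, excluding self-pairs.
positive_mask = (labels[:, None] == labels[None, :]) & ~np.eye(6, dtype=bool)
print(positive_mask.sum(), "positive (ordered) pairs")  # → 6
```

Each of the three classes contributes two ordered positive pairs, giving six in total; the loss then pulls those pairs together and pushes apart pairs with different labels.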
Do we have to normalize the feature vectors ourselves before sending them into NTXentLoss? I checked the code in pytorch-metric-learning, and it seems that you don't include a normalization step.
Thanks in advance!
JJ