Personalized loss functions for tensor decompositions #486
If you want to use gradient-based optimization, the easiest might be to just use TensorLy-Torch and plug in your loss directly. It would be great to have more losses supported in the main library too, if you're interested in opening a PR!
So, it turns out we already have an implementation of GCP @JeanKossaifi @earmingol, but it is not yet merged into TensorLy, mainly because we don't support all the backends. The implementation was done by @caglayantuna and the repo is here: There are a few examples in the doc, hope this helps!
Wow, this is cool! Thanks @cohenjer, once I have a chance I'll start playing with it and see how I could contribute :) @JeanKossaifi, I'm interested in trying tensorly-torch, but I'm not very familiar with using it beyond creating layers for bigger models. Do you have a quick example of running a decomposition, showing how to go from passing a tensor as input to obtaining the resulting factors? Thanks!
Yes, I need to add some documentation but it should be fairly straightforward -- I tried to make things as easy as possible :) Create a factorized tensor just like a PyTorch tensor:
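The code snippet from the original comment didn't survive extraction. As a rough stand-in (using plain `torch.nn.Parameter` factor matrices rather than tensorly-torch's `FactorizedTensor` class, with illustrative shapes and rank), creating the trainable decomposition might look like:

```python
import torch

torch.manual_seed(0)
data = torch.rand(5, 6, 7)  # tensor we want to decompose (random, for illustration)

rank = 3  # illustrative CP rank
# One trainable factor matrix per mode of the tensor
factors = [torch.nn.Parameter(0.1 * torch.randn(dim, rank))
           for dim in data.shape]
```

With tensorly-torch itself, the analogous step is something like `tltorch.FactorizedTensor.new(data.shape, rank=rank, factorization='CP')` (check the tensorly-torch docs for the exact signature).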
You can use your favourite optimizer to update the factors of the decomposition
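Continuing the same plain-PyTorch stand-in (the reconstruction helper and the optimizer hyperparameters here are illustrative, not the original snippet), a standard optimizer can update the factors:

```python
import torch

torch.manual_seed(0)
data = torch.rand(5, 6, 7)
rank = 3
factors = [torch.nn.Parameter(0.1 * torch.randn(dim, rank)) for dim in data.shape]

def cp_reconstruct(factors):
    # CP reconstruction for a 3-way tensor: sum of rank-1 outer products
    a, b, c = factors
    return torch.einsum('ir,jr,kr->ijk', a, b, c)

optimizer = torch.optim.Adam(factors, lr=0.01)
initial_loss = torch.norm(data - cp_reconstruct(factors)).item()

for _ in range(200):
    optimizer.zero_grad()
    loss = torch.norm(data - cp_reconstruct(factors)) ** 2
    loss.backward()
    optimizer.step()

final_loss = torch.norm(data - cp_reconstruct(factors)).item()
```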
Then you can just create a custom loss and apply it to the reconstruction:
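A sketch of the custom-loss step (the Poisson-style loss here is my example, not the one from the original comment):

```python
import torch

torch.manual_seed(0)
data = torch.rand(4, 5, 6)  # stand-in for count-like data
rank = 2
factors = [torch.nn.Parameter(torch.rand(dim, rank)) for dim in data.shape]

def cp_reconstruct(factors):
    a, b, c = factors
    return torch.einsum('ir,jr,kr->ijk', a, b, c)

def poisson_loss(x, m, eps=1e-10):
    # Poisson-style negative log-likelihood (up to constants);
    # assumes a non-negative model m
    return (m - x * torch.log(m + eps)).sum()

optimizer = torch.optim.Adam(factors, lr=0.01)
losses = []
for _ in range(100):
    optimizer.zero_grad()
    # the clamp crudely keeps the reconstruction non-negative
    rec = cp_reconstruct(factors).clamp(min=1e-6)
    loss = poisson_loss(data, rec)
    loss.backward()
    optimizer.step()
    losses.append(loss.item())
```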
You'd of course want to add a scheduler to decrease the learning rate every couple of iterations, etc.
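The scheduler mentioned above can be attached to the same optimizer; a minimal sketch using `StepLR` (my choice of scheduler, loss, and hyperparameters, purely for illustration):

```python
import torch

param = torch.nn.Parameter(torch.ones(3))
optimizer = torch.optim.Adam([param], lr=0.01)
# Halve the learning rate every 50 iterations
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=50, gamma=0.5)

for _ in range(150):
    optimizer.zero_grad()
    loss = (param ** 2).sum()  # toy objective
    loss.backward()
    optimizer.step()
    scheduler.step()
```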
@earmingol did you get a chance to give it a try? Did it work for your use case?
@JeanKossaifi unfortunately I got swamped by other things! I had a chance to try @cohenjer's approach, which works well but outputs somewhat unexpected results. I also tried implementing yours, but wasn't able to get to the point of outputting results. That's something I need to do soon though! I'll let you know if I have any progress or questions on that :)
@earmingol Feel free to contact me directly if tensorly-gcp has issues. I plan to debug it properly at some point, so any feedback (even just saying some functionality worked as expected) would be welcome!
Is there an option to implement an easy way to "plug and play" different loss functions in the tensor decompositions? It would be great to have something like that instead of editing each of the functions separately.

Tamara Kolda suggested more pertinent loss functions, depending on the type of data, in a talk she gave in 2018. For example, she suggested using the Rayleigh loss for non-negative CP, or the Boolean-odds loss for tensors with 0/1 values.

However, it looks like most of the decompositions implemented here only use the standard loss function, and changing that would require directly editing each of the methods.
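The "plug and play" idea can be sketched as a generic gradient-based CP fit that takes the loss as a parameter. The `rayleigh_loss` and `bernoulli_odds_loss` formulas below follow my recollection of the GCP losses from Hong, Kolda & Duersch's work and should be verified against the paper; the function names and hyperparameters are illustrative:

```python
import torch

def gaussian_loss(x, m):
    # The standard least-squares loss
    return ((x - m) ** 2).sum()

def rayleigh_loss(x, m, eps=1e-10):
    # For continuous non-negative data (formula as I recall it -- verify!)
    return (2 * torch.log(m + eps) + (torch.pi / 4) * (x / (m + eps)) ** 2).sum()

def bernoulli_odds_loss(x, m, eps=1e-10):
    # For 0/1 data, with m interpreted as odds (formula as I recall it -- verify!)
    return (torch.log(m + 1) - x * torch.log(m + eps)).sum()

def gcp_decompose(data, rank, loss_fn, n_iter=200, lr=0.01):
    """Fit a 3-way CP model by plugging in an arbitrary elementwise loss."""
    factors = [torch.nn.Parameter(torch.rand(dim, rank)) for dim in data.shape]
    optimizer = torch.optim.Adam(factors, lr=lr)
    for _ in range(n_iter):
        optimizer.zero_grad()
        # the clamp crudely keeps the model non-negative, as the
        # Rayleigh and Bernoulli-odds losses require
        rec = torch.einsum('ir,jr,kr->ijk', *factors).clamp(min=1e-6)
        loss_fn(data, rec).backward()
        optimizer.step()
    return factors
```

For example, `gcp_decompose(binary_tensor.float(), rank=5, loss_fn=bernoulli_odds_loss)`. A proper implementation would also support tensor orders other than 3 and explicit non-negativity constraints; the tensorly-gcp repo mentioned above covers the backend-integrated version.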