What's your input tensor? Can you produce a minimal snippet of code to reproduce your error? It seems your tensor is very sparse; could it also be low-rank?
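For reference, a minimal reproduction along these lines might construct a small tensor with the same traits (very sparse, low-rank) and report the quantities that matter: shape, sparsity, and value range. This is a hedged sketch with made-up toy data, not the original poster's tensor; the failing `non_negative_tucker` call would follow it.

```python
import numpy as np

# Build a small rank-1 tensor (outer product of three vectors), then
# zero out ~80% of its entries so it is very sparse, like the report suggests.
rng = np.random.default_rng(0)
a, b, c = rng.random(4), rng.random(5), rng.random(6)
T = np.einsum('i,j,k->ijk', a, b, c)   # rank-1, hence low-rank
T[rng.random(T.shape) < 0.8] = 0.0     # make it roughly 80% sparse

print(T.shape)                          # tensor shape
print(float((T == 0).mean()))           # sparsity fraction
print(float(T.min()), float(T.max()))   # value range
```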
It also seems the range might be problematic. I think that's also linked with another thing I want to add in TensorLy: optional normalization of the input tensor (and un-normalizing directly in factorized form). @cohenjer @aarmey
As a small test, could you try (i) a lower rank and (ii) normalizing your tensor (e.g. removing the mean and dividing by the std)?
Just to complete @JeanKossaifi's great answer: you can easily normalize a tensor T using `T = T / tl.norm(T)`. Centering, however, would make some values in the tensor negative, so I would not do it in the context of a nonnegative decomposition.
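To illustrate the distinction on a toy tensor (using `np.linalg.norm`, which computes the same Frobenius norm as `tl.norm` with its defaults): scaling by the norm preserves nonnegativity, while mean-centering necessarily introduces negative entries.

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.random((4, 5, 6))  # nonnegative toy tensor

# Dividing by the Frobenius norm only rescales the magnitude;
# every entry stays nonnegative, so NN decompositions are still valid.
T_scaled = T / np.linalg.norm(T)
print(T_scaled.min() >= 0)       # True

# Mean-centering shifts roughly half the entries below zero,
# which breaks the nonnegativity assumption.
T_centered = T - T.mean()
print((T_centered < 0).any())    # True
```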
Another problem that may occur is if whole slices of the tensor are zero. `non_negative_tucker` is based on multiplicative updates, so you may be dividing by zero at some point in the algorithm. There is a safeguard (a small epsilon of 1e-12 is added), but maybe at your precision this epsilon is too small and gets rounded to 0?
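The precision concern is easy to demonstrate. The snippet below is a simplified sketch of a multiplicative-update denominator, not TensorLy's actual implementation; it only shows that a 1e-12 epsilon survives in float64 but underflows to exactly 0 in float16, so the safeguard silently disappears at low precision.

```python
import numpy as np

eps = 1e-12  # the safeguard value mentioned above

# float64: adding eps to a zero denominator keeps it strictly positive.
den64 = np.float64(0.0) + np.float64(eps)
print(den64 > 0)   # True

# float16: 1e-12 is below the smallest representable subnormal (~6e-8),
# so it rounds to 0 and the denominator is zero again.
den16 = np.float16(0.0) + np.float16(eps)
print(den16 > 0)   # False
```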
Code: [attachment not recovered]
`dat` is: [image not recovered]
The `facs_overall`: [image not recovered]
Could someone provide an explanation for this?