Tensor decompositions



We can think of matrices as tensors of order 2. Decomposing matrices is well understood; decomposing tensors of order higher than 2 is much less so, although it is an active research area, and I know little about it. For a flavour of the field, see e.g. the TensorLy decomposition example notebooks.
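To make the order-2 analogy concrete, here is a minimal sketch using TensorLy's `parafac`, which computes a CP decomposition, i.e. expresses a tensor as a sum of rank-1 outer products, the way a rank-r matrix is a sum of r rank-1 matrices. The shapes and rank here are arbitrary illustrations.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

# Build a synthetic order-3 tensor of known CP rank 3:
# a sum of 3 outer products, the tensor analogue of a rank-3 matrix.
rng = np.random.default_rng(0)
factors_true = [rng.standard_normal((dim, 3)) for dim in (10, 12, 14)]
T = tl.cp_to_tensor((np.ones(3), factors_true))

# Recover a rank-3 CP decomposition (alternating least squares under the hood).
cp = parafac(T, rank=3)
T_hat = tl.cp_to_tensor(cp)

print("relative reconstruction error:", tl.norm(T - T_hat) / tl.norm(T))
```

Unlike the matrix SVD, fitting a CP decomposition is a non-convex problem and `parafac` only finds a local optimum, which is one reason the higher-order theory is harder.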

The tntorch documentation lists some fun applications of these decompositions; a small compression example follows.
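For instance, here is a minimal sketch of compressing a smooth tensor into tensor-train (TT) format. I am assuming tntorch's `tn.Tensor(..., ranks_tt=...)` / `.torch()` API here, and the synthetic tensor and rank cap are arbitrary.

```python
import torch
import tntorch as tn

# A smooth synthetic order-3 tensor; smooth data tends to have low TT rank.
x = torch.linspace(0, 1, 32)
X = torch.sin(x[:, None, None] + x[None, :, None] * x[None, None, :])

# Truncate into tensor-train format, capping all TT ranks at 4.
t = tn.Tensor(X, ranks_tt=4)

print(t)  # the repr reports the shape and TT ranks
print("relative error:", (torch.norm(t.torch() - X) / torch.norm(X)).item())
```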

References

Anandkumar, Anima, Rong Ge, Daniel Hsu, Sham M. Kakade, and Matus Telgarsky. 2015. β€œTensor Decompositions for Learning Latent Variable Models (A Survey for ALT).” In Algorithmic Learning Theory, edited by Kamalika Chaudhuri, Claudio Gentile, and Sandra Zilles, 19–38. Lecture Notes in Computer Science. Springer International Publishing.
Anandkumar, Animashree, Rong Ge, Daniel Hsu, Sham M. Kakade, and Matus Telgarsky. 2014. β€œTensor Decompositions for Learning Latent Variable Models.” The Journal of Machine Learning Research 15 (1): 2773–2832.
Belkin, Mikhail, Luis Rademacher, and James Voss. 2016. β€œBasis Learning as an Algorithmic Primitive.” In Journal of Machine Learning Research, 446–87.
Bi, Xuan, Xiwei Tang, Yubai Yuan, Yanqing Zhang, and Annie Qu. 2021. β€œTensors in Statistics.” Annual Review of Statistics and Its Application 8 (1): 345–68.
Kossaifi, Jean, Yannis Panagakis, Anima Anandkumar, and Maja Pantic. 2019. β€œTensorLy: Tensor Learning in Python.” Journal of Machine Learning Research 20 (26): 1–6.
Rabusseau, Guillaume, and FranΓ§ois Denis. 2014. β€œLearning Negative Mixture Models by Tensor Decompositions.” arXiv:1403.4224 [cs], March.
Robeva, E. 2016. β€œOrthogonal Decomposition of Symmetric Tensors.” SIAM Journal on Matrix Analysis and Applications 37 (1): 86–102.
Robeva, Elina, and Anna Seigal. 2016. β€œSingular Vectors of Orthogonally Decomposable Tensors.” arXiv:1603.09004 [math], March.
Tenenbaum, J. B., and W. T. Freeman. 2000. β€œSeparating Style and Content with Bilinear Models.” Neural Computation 12 (6): 1247–83.
