We can think of matrices as tensors of order 2, and decomposing matrices is pretty well understood. I know much less about decomposing tensors of order higher than 2. For a flavour of the field, see the tensorly decomposition example notebooks.
The tntorch documentation asserts that the following are the most popular formats:
- CANDECOMP/PARAFAC (CP) (Kolda and Bader 2009)
- Tucker (De Lathauwer, De Moor, and Vandewalle 2000)
- Tensor Train (TT) (Oseledets 2011)
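To make the last of these concrete, here is a hedged pure-NumPy sketch of the TT-SVD algorithm from Oseledets (2011): sweep over the modes, reshaping and truncating an SVD at each step. All names here are my own illustration; tntorch and tensorly ship production versions.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose `tensor` into Tensor Train cores via sequential
    truncated SVDs; `max_rank` caps every TT-rank."""
    shape = tensor.shape
    cores, r_prev = [], 1
    mat = tensor
    for n_k in shape[:-1]:
        # Unfold: (previous rank * current mode) x (remaining modes).
        mat = mat.reshape(r_prev * n_k, -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))
        # Left singular vectors become the next TT core.
        cores.append(u[:, :r].reshape(r_prev, n_k, r))
        # Carry the remainder forward to the next sweep step.
        mat = s[:r, None] * vt[:r]
        r_prev = r
    cores.append(mat.reshape(r_prev, shape[-1], 1))
    return cores

def tt_to_tensor(cores):
    """Contract a list of TT cores back into a dense tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=[[-1], [0]])
    return out.reshape([c.shape[1] for c in cores])
```

When `max_rank` is large enough that no singular values are discarded, the reconstruction is exact; smaller caps give the usual low-rank compression trade-off.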
Applications listed in the tntorch documentation:
- Active Subspaces
- ANOVA Decomposition
- Arithmetics
- Automata
- Classification
- Tensor Completion
- Cross-approximation
- Tensor Decompositions
- Exponential Machines
- Boolean Logic
- Polynomial Chaos Expansions
- Sobol Indices
- Vector Fields
References
Anandkumar, Anima, Rong Ge, Daniel Hsu, Sham M. Kakade, and Matus Telgarsky. 2015. “Tensor Decompositions for Learning Latent Variable Models (A Survey for ALT).” In Algorithmic Learning Theory, edited by Kamalika Chaudhuri, Claudio Gentile, and Sandra Zilles, 19–38. Lecture Notes in Computer Science. Springer International Publishing.
Anandkumar, Animashree, Rong Ge, Daniel Hsu, Sham M. Kakade, and Matus Telgarsky. 2014. “Tensor Decompositions for Learning Latent Variable Models.” The Journal of Machine Learning Research 15 (1): 2773–2832.
Belkin, Mikhail, Luis Rademacher, and James Voss. 2016. “Basis Learning as an Algorithmic Primitive.” In Journal of Machine Learning Research, 446–87.
Bi, Xuan, Xiwei Tang, Yubai Yuan, Yanqing Zhang, and Annie Qu. 2021. “Tensors in Statistics.” Annual Review of Statistics and Its Application 8 (1): 345–68.
Cui, Tiangang, and Sergey Dolgov. 2022. “Deep Composition of Tensor-Trains Using Squared Inverse Rosenblatt Transports.” Foundations of Computational Mathematics 22 (6): 1863–1922.
De Lathauwer, Lieven, Bart De Moor, and Joos Vandewalle. 2000. “On the Best Rank-1 and Rank-(R1, R2, …, RN) Approximation of Higher-Order Tensors.” SIAM Journal on Matrix Analysis and Applications 21 (4): 1324–42.
Kolda, Tamara G., and Brett W. Bader. 2009. “Tensor Decompositions and Applications.” SIAM Review 51 (3): 455–500.
Kossaifi, Jean, Nikola Borislavov Kovachki, Kamyar Azizzadenesheli, and Anima Anandkumar. 2023. “Multi-Grid Tensorized Fourier Neural Operator for High Resolution PDEs,” February.
Kossaifi, Jean, Yannis Panagakis, Anima Anandkumar, and Maja Pantic. 2019. “TensorLy: Tensor Learning in Python.” Journal of Machine Learning Research 20 (26): 1–6.
Malik, Osman Asif, and Stephen Becker. 2018. “Low-Rank Tucker Decomposition of Large Tensors Using TensorSketch.”
Oseledets, I. V. 2011. “Tensor-Train Decomposition.” SIAM Journal on Scientific Computing 33 (5): 2295–2317.
Pan, Chenjian, Chen Ling, Hongjin He, Liqun Qi, and Yanwei Xu. 2020. “Low-Rank and Sparse Enhanced Tucker Decomposition for Tensor Completion.” arXiv.
Rabanser, Stephan, Oleksandr Shchur, and Stephan Günnemann. 2017. “Introduction to Tensor Decompositions and Their Applications in Machine Learning.”
Rabusseau, Guillaume, and François Denis. 2014. “Learning Negative Mixture Models by Tensor Decompositions.” arXiv:1403.4224 [cs], March.
Robeva, E. 2016. “Orthogonal Decomposition of Symmetric Tensors.” SIAM Journal on Matrix Analysis and Applications 37 (1): 86–102.
Robeva, Elina, and Anna Seigal. 2016. “Singular Vectors of Orthogonally Decomposable Tensors.” arXiv:1603.09004 [math], March.
Tenenbaum, J. B., and W. T. Freeman. 2000. “Separating Style and Content with Bilinear Models.” Neural Computation 12 (6): 1247–83.
Tran, Alasdair, Alexander Mathews, Lexing Xie, and Cheng Soon Ong. 2022. “Factorized Fourier Neural Operators.” arXiv.
Zhao, Yiran, and Tiangang Cui. 2023. “Tensor-Based Methods for Sequential State and Parameter Estimation in State Space Models.” arXiv.