Tensor regression



Generalise your usual linear regression to multilinear regression, i.e. regression where the covariates (and possibly the responses) are tensors rather than vectors, so the coefficient array is itself a tensor. Useful tool: tensor decompositions, which constrain that coefficient tensor to low rank and keep the parameter count manageable. TensorLy (Kossaifi et al. 2019) is, I think, the main implementation of note; a sketch follows.
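
Concretely, the model is y_i = ⟨W, X_i⟩ + noise, with the coefficient tensor W restricted to low CP (or Tucker) rank. Here is a minimal sketch assuming TensorLy's `CPRegressor` interface (older releases call it `KruskalRegressor`, so check your installed version); the synthetic data, shapes, and rank are made up purely for illustration.

```python
# Low-rank tensor regression sketch using TensorLy's CPRegressor.
# Assumptions: the CPRegressor import path and `weight_rank` argument
# as in recent TensorLy releases; all data below is synthetic.
import numpy as np
import tensorly as tl
from tensorly.regression.cp_regression import CPRegressor

rng = np.random.default_rng(0)

# Each covariate is a small matrix-valued (image-like) observation.
n_samples, height, width = 200, 12, 12
X = tl.tensor(rng.standard_normal((n_samples, height, width)))

# Ground-truth coefficient tensor of CP rank 2: a sum of two outer products.
a = rng.standard_normal((2, height))
b = rng.standard_normal((2, width))
W_true = a[0][:, None] * b[0][None, :] + a[1][:, None] * b[1][None, :]
y = tl.tensor(
    np.einsum("nhw,hw->n", tl.to_numpy(X), W_true)
    + 0.1 * rng.standard_normal(n_samples)
)

# Fit a regression whose coefficient tensor is constrained to CP rank 2,
# so the number of parameters grows additively (height + width per component)
# rather than multiplicatively (height * width).
model = CPRegressor(weight_rank=2, n_iter_max=200, verbose=0)
model.fit(X, y)
y_hat = model.predict(X)
rmse = float(np.sqrt(np.mean((tl.to_numpy(y_hat) - tl.to_numpy(y)) ** 2)))
print("in-sample RMSE:", rmse)
```

Swapping in `TuckerRegressor` gives the Tucker-constrained analogue, at the cost of specifying a rank per mode instead of a single CP rank.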

References

Anandkumar, Anima, Rong Ge, Daniel Hsu, Sham M. Kakade, and Matus Telgarsky. 2015. β€œTensor Decompositions for Learning Latent Variable Models (A Survey for ALT).” In Algorithmic Learning Theory, edited by Kamalika Chaudhuri, Claudio Gentile, and Sandra Zilles, 19–38. Lecture Notes in Computer Science. Springer International Publishing.
Anandkumar, Animashree, Rong Ge, Daniel Hsu, Sham M. Kakade, and Matus Telgarsky. 2014. β€œTensor Decompositions for Learning Latent Variable Models.” The Journal of Machine Learning Research 15 (1): 2773–2832.
Bi, Xuan, Xiwei Tang, Yubai Yuan, Yanqing Zhang, and Annie Qu. 2021. β€œTensors in Statistics.” Annual Review of Statistics and Its Application 8 (1): 345–68.
Cui, Tiangang, and Sergey Dolgov. 2022. β€œDeep Composition of Tensor-Trains Using Squared Inverse Rosenblatt Transports.” Foundations of Computational Mathematics 22 (6): 1863–1922.
Kossaifi, Jean, Yannis Panagakis, Anima Anandkumar, and Maja Pantic. 2019. β€œTensorLy: Tensor Learning in Python.” Journal of Machine Learning Research 20 (26): 1–6.
