Tensors and stuff. Like linear algebra, but linear in each of several arguments separately, i.e. multilinear.
Keywords: Ricci calculus, Einstein summation notation, index notation, subscript notation.
- Tai-Danae Bradley's compact explanation of Einstein summation, which turns out to be as simple as it can be, but no simpler
- Jeremy Kun
- Ilan Ben-Yaacov and Francesc Roig, Index Notation for Vector Calculus
- Dan Fleisch’s Student’s Guide to Vectors and Tensors
- J. Pearson, Index Notation
- John Crimaldi, A Primer on Index Notation
- John D. Cook’s tensor exposition, parts 2 and 3
When some of the vectors are differentials, should this be smushed into differential geometry?
If we crack open a tensor textbook we get a lot of guff about general relativity and tensor fields and such, which is all very nice but not germane to typical machine learning applications. We want to start with the immediately needed thing: some tidy notational conventions for handling multilinear operations without drowning in squiggles.
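The tidy convention in question is Einstein summation: a repeated index within a term implies a sum over that index, so $C_{ik} = A_{ij}B_{jk}$ is matrix multiplication with the $\sum_j$ left silent. numpy's `einsum` implements this convention directly; a minimal sketch (my own toy examples, not from any of the references above):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 4))
B = rng.normal(size=(4, 5))
x = rng.normal(size=3)
y = rng.normal(size=4)

# Matrix product: C_ik = A_ij B_jk (sum over the repeated index j)
C = np.einsum("ij,jk->ik", A, B)
assert np.allclose(C, A @ B)

# Trace: t = M_ii (sum over the repeated index i)
M = rng.normal(size=(4, 4))
assert np.isclose(np.einsum("ii", M), np.trace(M))

# Bilinear form: s = x_i A_ij y_j, i.e. linear in x and in y separately
s = np.einsum("i,ij,j", x, A, y)
assert np.isclose(s, x @ A @ y)
```

The point is that one notation covers contractions of any order; adding a third or fourth index costs nothing, whereas matrix notation runs out of road past two.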
Soeren Laue, Matthias Mitterreiter, Joachim Giesen and Jens K. Mueller have been popularising such an approach. In their paper (Laue, Mitterreiter, and Giesen 2018), they argue that deriving matrix-calculus results is greatly simplified by Ricci calculus, and, as a bonus, it often leads to faster code.
They have a website, MatrixCalculus.org, which showcases the symbolic tensor calculus part of this trick online (though not the accelerated code generation bit).
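To get the flavour of the trick, here is the classic baby example (my own illustration, not taken from their paper): writing the quadratic form as $f = x_i A_{ij} x_j$, the product rule gives $\partial f / \partial x_k = \delta_{ik} A_{ij} x_j + x_i A_{ij} \delta_{jk} = A_{kj} x_j + x_i A_{ik}$, i.e. $\nabla f = (A + A^\top)x$, with no case-splitting over layout conventions. A numerical sanity check:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4))
x = rng.normal(size=4)

# Gradient read off from the index-notation derivation: (A + A^T) x
grad_index = (A + A.T) @ x

def f(v):
    """Quadratic form f(v) = v^T A v."""
    return v @ A @ v

# Central finite differences as an independent cross-check
eps = 1e-6
grad_fd = np.array([
    (f(x + eps * e) - f(x - eps * e)) / (2 * eps)
    for e in np.eye(4)
])
assert np.allclose(grad_index, grad_fd, atol=1e-5)
```

The same mechanical index-pushing scales to higher-order tensor expressions where the usual matrix-cookbook identities give out.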
Here are some tasty readings on relevant bits of tensor machinery.
See tensor decomposition.