We can generalise high-school calculus, which concerns scalar functions of a scalar argument, in various ways to handle matrix-valued functions or matrix-valued arguments and still keep things tidy. One could generalise further, to full tensor calculus. But matrix/vector operations happen to sit at a useful level of complexity for lots of algorithms, so notation specialised to them earns its keep. (I usually want this for higher-order gradient descent.)

I mention two convenient and popular formalisms for lazy matrix calculus. In practice a mix of the two is often useful.

## Matrix differentials

🏗 I need to return to this and tidy it up with some examples.

Matrix differentials are a special case of tensor calculus, arising when the ranks of a function's argument and value are not too big. In this setting we often get to *cheat* and use some handy shortcuts.

A fun pain point: agreeing upon the layout of derivatives. In *numerator layout*, the derivative of a scalar by a column vector is a row vector; in *denominator layout*, it is a column vector. Sources mix the two freely, so check conventions before transcribing formulae.

If our problem is nice, this machinery often gets us a low-fuss, compact, tidy solution, even in some surprising cases where more general tensors would seem more natural (for those, see below).
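A concrete instance of the shortcut: from the differential identity `d(log det X) = tr(X^{-1} dX)` we can read off the derivative `X^{-T}` (numerator layout). A minimal numpy sketch checking this against finite differences; the particular matrix and step size are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
X = A @ A.T + 4.0 * np.eye(4)  # positive definite, so log det X is well defined

# Differential identity: d(log det X) = tr(X^{-1} dX),
# so the derivative in numerator layout is X^{-T} (X is symmetric here).
grad_analytic = np.linalg.inv(X).T

# Central finite differences, one matrix entry at a time.
eps = 1e-6
grad_fd = np.zeros_like(X)
for i in range(4):
    for j in range(4):
        E = np.zeros_like(X)
        E[i, j] = eps
        fp = np.linalg.slogdet(X + E)[1]
        fm = np.linalg.slogdet(X - E)[1]
        grad_fd[i, j] = (fp - fm) / (2 * eps)

print(np.max(np.abs(grad_fd - grad_analytic)))  # should be tiny
```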

The rough-and-ready matrix differential notation is occasionally confusing, but it has a functional-analysis interpretation: the differential is a linear map (a Fréchet derivative) acting on the perturbation.

Some useful sources:

- *The Matrix Calculus You Need For Deep Learning* (Parr and Howard 2018)
- Many quick recipes: the *Matrix Cookbook* (Petersen and Pedersen 2012)
- Brookes’ Matrix Reference Manual
- More expository, but not as broad: *Old and New Matrix Algebra Useful for Statistics* (Minka 2000)
- Autodiff-focussed: *Collected Matrix Derivative Results for Forward and Reverse Mode Algorithmic Differentiation* (Giles 2008)
- Alan Edelman’s lectures are very pedagogical on matrices and calculus (and go into useful random matrix theory)
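That linear-map reading of the differential is easy to test numerically. For matrix inversion, the rule is `d(X^{-1}) = -X^{-1} dX X^{-1}`, a linear function of the perturbation `dX`, which should match a directional finite difference. The matrices below are arbitrary examples of mine:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((3, 3)) + 5.0 * np.eye(3)  # well-conditioned, invertible
dX = rng.standard_normal((3, 3))                   # arbitrary perturbation direction

# Differential rule for the inverse: d(X^{-1}) = -X^{-1} dX X^{-1}.
Xinv = np.linalg.inv(X)
rule = -Xinv @ dX @ Xinv

# Directional finite difference of the inversion map along dX.
eps = 1e-6
fd = (np.linalg.inv(X + eps * dX) - np.linalg.inv(X - eps * dX)) / (2 * eps)

print(np.max(np.abs(fd - rule)))  # should be tiny
```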

## Automating matrix calculus

- Use Mathematica with the NCAlgebra package to find matrix differentials.
- Soeren Laue, Matthias Mitterreiter, Joachim Giesen, and Jens K. Mueller (Laue, Mitterreiter, and Giesen 2018) have a website, MatrixCalculus.org, which uses Ricci calculus to generate matrix calculus formulae. Bonus feature: it generates both Python and LaTeX code.
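If neither Mathematica nor the website is to hand, sympy's matrix expressions can also do some of this symbolically. A small sketch (my own example, not MatrixCalculus.org output) deriving the classic derivative of `a^T X b` with respect to `X`, which should be the outer product `a b^T`:

```python
import numpy as np
import sympy as sp

X = sp.MatrixSymbol('X', 3, 3)
a = sp.MatrixSymbol('a', 3, 1)
b = sp.MatrixSymbol('b', 3, 1)

# Scalar-valued (1x1) matrix expression a^T X b; differentiate w.r.t. X.
grad = (a.T * X * b).diff(X)
print(grad)  # expected to print a*b.T (numerator-layout derivative)

# Numeric spot check against the outer product a b^T.
f = sp.lambdify((X, a, b), grad, modules="numpy")
av = np.array([[1.0], [2.0], [3.0]])
bv = np.array([[4.0], [5.0], [6.0]])
print(np.allclose(f(np.zeros((3, 3)), av, bv), av @ bv.T))
```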

## Indexed tensor calculus

Filed under multilinear algebra.
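In code, index-notation expressions map almost verbatim onto einsum; a minimal sketch, with examples of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))

# C_ik = A_ij B_jk: the repeated index j is summed over, exactly as in index notation.
C = np.einsum('ij,jk->ik', A, B)
print(np.allclose(C, A @ B))  # True

# tr(M N) = M_ij N_ji: a full contraction down to a scalar.
M = rng.standard_normal((3, 3))
N = rng.standard_normal((3, 3))
print(np.isclose(np.einsum('ij,ji->', M, N), np.trace(M @ N)))  # True
```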

## References

Bhatia, Rajendra. 1997. *Matrix Analysis*. Vol. 169. Graduate Texts in Mathematics. New York, NY: Springer.

Giles, Mike B. 2008. “Collected Matrix Derivative Results for Forward and Reverse Mode Algorithmic Differentiation.” In *Advances in Automatic Differentiation*, edited by Christian H. Bischof, H. Martin Bücker, Paul Hovland, Uwe Naumann, and Jean Utke, 64:35–44. Berlin, Heidelberg: Springer Berlin Heidelberg.

Golub, Gene H., and Charles F. Van Loan. *Matrix Computations*. JHU Press.

Graham, Alexander. *Kronecker Products and Matrix Calculus: With Applications*. Horwood.

Gupta, A. K., and D. K. Nagar. *Matrix Variate Distributions*. Chapman & Hall/CRC Monographs and Surveys in Pure and Applied Mathematics 104. Boca Raton: Chapman and Hall/CRC.

Laue, Soeren, Matthias Mitterreiter, and Joachim Giesen. 2018. “Computing Higher Order Derivatives of Matrix and Tensor Expressions.” In *Advances in Neural Information Processing Systems 31*, edited by S. Bengio, H. Wallach, H. Larochelle, K. Grauman, N. Cesa-Bianchi, and R. Garnett, 2750–59. Curran Associates, Inc.

*AAAI Conference on Artificial Intelligence (AAAI)*.

Magnus, Jan R., and Heinz Neudecker. *Matrix Differential Calculus with Applications in Statistics and Econometrics*. 3rd ed. Wiley Series in Probability and Statistics. Hoboken, NJ: Wiley.

Minka, Thomas P. 2000. *Old and New Matrix Algebra Useful for Statistics*.

*Wiley StatsRef: Statistics Reference Online*. Wiley.

Searle, Shayle R. *Matrix Algebra Useful for Statistics*. John Wiley & Sons.

Seber, George A. F. *A Matrix Handbook for Statisticians*. Wiley.

*SIAM Review* 58 (3): 377–441.

Steeb, Willi-Hans, and Yorick Hardy. *Problems and Solutions in Introductory and Advanced Matrix Calculus*. World Scientific.

Turkington, Darrell A. *Matrix Calculus and Zero-One Matrices: Statistical and Econometric Applications*. Cambridge; New York: Cambridge University Press.

*SIAM Journal on Mathematics of Data Science* 3 (1): 171–200.
