# Matrix calculus

We can generalise high-school calculus, which concerns scalar functions of a scalar argument, in various ways to handle matrix-valued functions of matrix-valued arguments while keeping the notation tidy. One could generalise further still, to full tensor calculus, but matrix/vector operations happen to sit at a useful level of complexity for many algorithms. (I usually want this for higher-order gradient descent.)

I mention two convenient and popular formalisms for lazy matrix calculus. In practice a mix of the two is often useful.

## Matrix differentials

🏗 I need to return to this and tidy it up with some examples.

A special case of tensor calculus, in which the ranks of the function's argument and value are small. In this setting we often get to cheat and use some handy shortcuts. A fun pain point: agreeing on the layout of derivatives, numerator versus denominator convention.

If our problem is nice, this often gets us a low-fuss, compact, tidy solution, even in some surprising cases where more general tensors would seem more natural (for which, see below).
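A minimal sketch of the differential shortcut in action (my own illustration, not from any of the tools below): the classic identity $\mathrm{d}(x^\top A x) = x^\top (A + A^\top)\,\mathrm{d}x$ gives the gradient $(A + A^\top)x$ directly, which we can sanity-check against finite differences.

```python
import numpy as np

# Matrix-differential identity: for f(x) = x^T A x,
# df = x^T (A + A^T) dx, hence grad f = (A + A^T) x.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)

grad_analytic = (A + A.T) @ x

# Central finite differences as an independent check.
eps = 1e-6
grad_fd = np.empty(n)
for i in range(n):
    e = np.zeros(n)
    e[i] = eps
    grad_fd[i] = ((x + e) @ A @ (x + e) - (x - e) @ A @ (x - e)) / (2 * eps)

assert np.allclose(grad_analytic, grad_fd, atol=1e-4)
```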

## Automating matrix calculus

1. Use Mathematica+NCAlgebra to find matrix differentials.
2. Soeren Laue, Matthias Mitterreiter, Joachim Giesen and Jens K. Mueller maintain MatrixCalculus.org, which uses Ricci calculus to generate matrix-calculus formulae. Bonus feature: it generates both Python and LaTeX code.
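For a flavour of what automated symbolic matrix differentiation produces, here is a sketch using SymPy (not one of the tools listed above, but the same idea): differentiate $f(x) = x^\top A x$ symbolically and confirm it matches the known closed form $(A + A^\top)x$.

```python
import sympy as sp

# Symbolic check of grad(x^T A x) = (A + A^T) x for a 2x2 case.
x1, x2 = sp.symbols('x1 x2')
a, b, c, d = sp.symbols('a b c d')
A = sp.Matrix([[a, b], [c, d]])
x = sp.Matrix([x1, x2])
f = (x.T * A * x)[0, 0]

grad = sp.Matrix([f]).jacobian(x).T  # column-vector gradient
expected = (A + A.T) * x
assert sp.simplify(grad - expected) == sp.zeros(2, 1)
```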

## Indexed tensor calculus

Filed under multilinear algebra.
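Indexed (Ricci-style) notation translates almost mechanically into code via `einsum`; a small sketch of my own, assuming NumPy: the bilinear form $y = x_i A_{ij} x_j$ and the matrix product $C_{ik} = A_{ij} B_{jk}$ written as explicit index contractions.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
x = rng.standard_normal(3)

# y = x_i A_{ij} x_j: contract both indices against x.
y = np.einsum('i,ij,j->', x, A, x)

# C_{ik} = A_{ij} B_{jk}: sum over the repeated index j.
C = np.einsum('ij,jk->ik', A, B)

assert np.isclose(y, x @ A @ x)
assert np.allclose(C, A @ B)
```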
