# Precision matrix estimation

Estimating the thing that is given to you by oracles in statistics homework assignments: the covariance matrix, or its inverse, the precision matrix. Or, if your data is indexed in some fashion, the covariance kernel. We are especially interested in the Gaussian case, notably Gaussian processes, where the covariance kernel characterises the process up to its mean.

I am not introducing a complete theory of covariance estimation here, merely mentioning a couple of tidbits for future reference.

Two big-data problems can arise here: large $$p$$ (ambient dimension) and large $$n$$ (sample size). Large $$p$$ is a problem because the covariance matrix is a $$p \times p$$ matrix, and frequently we need to invert it to calculate some target estimand.

Large $$n$$ is often not too bad with Gaussian structure because, essentially, the Gaussian is an exponential family and hence admits fixed-dimensional sufficient statistics; we never need to hold all $$n$$ observations in memory at once.
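To make the sufficient-statistics point concrete, here is a minimal sketch (my own illustration, not from any particular library): a single pass over a data stream accumulates only the running sum and the running sum of outer products, from which the sample mean and covariance are recovered exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 3
X = rng.standard_normal((1000, p))  # stand-in for a long data stream

n = 0
s = np.zeros(p)        # running sum of observations
S = np.zeros((p, p))   # running sum of outer products
for x in X:            # each x could arrive one at a time off a stream
    n += 1
    s += x
    S += np.outer(x, x)

mean = s / n
# Unbiased sample covariance, recovered purely from (n, s, S).
cov = (S - n * np.outer(mean, mean)) / (n - 1)
```

The pair $$(\sum_i x_i, \sum_i x_i x_i^\top)$$ is all the data ever contributes to the Gaussian likelihood, which is why large $$n$$ is cheap here.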

## The obvious way

Estimate the covariance matrix then invert it. This is the baseline. 🏗
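A minimal sketch of this baseline in numpy (my own illustration): form the empirical covariance, then invert it. Using the pseudo-inverse hedges against the near-singularity that bites whenever $$n$$ is not much larger than $$p$$.

```python
import numpy as np

rng = np.random.default_rng(1)
p, n = 4, 500
X = rng.standard_normal((n, p))

Sigma_hat = np.cov(X, rowvar=False)    # empirical covariance, p x p
# Pseudo-inverse rather than inv: degrades gracefully if Sigma_hat
# is rank-deficient (e.g. when n < p).
Theta_hat = np.linalg.pinv(Sigma_hat)  # precision matrix estimate
```

With $$n \gg p$$ as above, `Theta_hat` is an honest inverse of `Sigma_hat`; with $$n < p$$ the empirical covariance is singular and this baseline breaks down, which is what motivates the fancier estimators.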

## Bayesian

🏗 Wishart priors?
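One way that might go, sketched under the simplifying assumption of a known (zero) mean: the inverse-Wishart is the conjugate prior for a Gaussian covariance, so the posterior update is just addition of the scatter matrix. The particular prior hyperparameters here are placeholders, not canonical choices.

```python
import numpy as np

rng = np.random.default_rng(2)
p, n = 3, 200
X = rng.standard_normal((n, p))

# Prior: Sigma ~ InverseWishart(nu0, Psi0). These values are illustrative.
nu0 = p + 2
Psi0 = np.eye(p)

# Conjugate update with known zero mean:
# Sigma | X ~ InverseWishart(nu0 + n, Psi0 + sum_i x_i x_i^T)
nu_n = nu0 + n
Psi_n = Psi0 + X.T @ X  # prior scale plus data scatter matrix

# Posterior mean of the covariance (defined when nu_n > p + 1),
# and its inverse as a point estimate of the precision matrix.
Sigma_post = Psi_n / (nu_n - p - 1)
Theta_post = np.linalg.inv(Sigma_post)
```

The prior acts as regularisation: even with $$n < p$$, `Psi_n` stays positive definite thanks to `Psi0`, so the posterior-mean covariance is always invertible.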