Matrix measure concentration inequalities and bounds
November 25, 2014 — March 8, 2021
Concentration inequalities for matrix-valued random variables: loosely speaking, promises that some random matrix is close to some known value, as measured in some metric, with some controlled probability.
Recommended overviews are J. A. Tropp (2015); van Handel (2017); Vershynin (2018).
1 Matrix Chernoff
J. A. Tropp (2015) summarises:
In recent years, random matrices have come to play a major role in computational mathematics, but most of the classical areas of random matrix theory remain the province of experts. Over the last decade, with the advent of matrix concentration inequalities, research has advanced to the point where we can conquer many (formerly) challenging problems with a page or two of arithmetic.
How do these relate to discrepancy-theoretic approaches? Nikhil Srivastava’s Discrepancy, Graphs, and the Kadison-Singer Problem has an interesting example of bounds via discrepancy theory (and only indirectly via probability). D. Gross (2011) is also readable, and gives results for matrices over the complex field.
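As a sanity check, here is a minimal Monte Carlo sketch of one standard form of the matrix Chernoff bound (Tropp 2015): for independent random positive-semidefinite $d \times d$ matrices $X_i$ with $\lambda_{\max}(X_i) \le R$ almost surely, $Y = \sum_i X_i$ and $\mu_{\max} = \lambda_{\max}(\mathbb{E} Y)$, we have $\Pr\{\lambda_{\max}(Y) \ge (1+\delta)\mu_{\max}\} \le d\,\bigl[e^{\delta}/(1+\delta)^{1+\delta}\bigr]^{\mu_{\max}/R}$. The rank-one ensemble below is an arbitrary choice of mine, not taken from any of the cited papers.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, R = 5, 200, 1.0  # matrix dimension, number of summands, a.s. eigenvalue bound

def sample_X():
    # Arbitrary illustrative ensemble (my choice, not from the references):
    # a uniformly weighted random rank-one projector, so that
    # 0 <= lambda_min(X) and lambda_max(X) <= R almost surely.
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)
    return R * rng.uniform() * np.outer(v, v)

# For this ensemble E[X] = (R/2)(I/d), so mu_max = lambda_max(E[sum_i X_i]) = n R / (2 d).
mu_max = n * R / (2 * d)

# Empirical tail of lambda_max(Y) over many realisations of Y = sum_i X_i.
reps = 500
lam_max = np.array([
    np.linalg.eigvalsh(sum(sample_X() for _ in range(n)))[-1]
    for _ in range(reps)
])

for delta in (0.25, 0.5, 1.0):
    empirical = (lam_max >= (1 + delta) * mu_max).mean()
    bound = d * np.exp((mu_max / R) * (delta - (1 + delta) * np.log1p(delta)))
    print(f"delta={delta}: empirical {empirical:.4f} <= bound {min(bound, 1.0):.4f}")
```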
2 Matrix Chebyshev
As discussed in, e.g., Paulin, Mackey, and Tropp (2016). In the form given there: let $X$ be a random Hermitian matrix and write $|X| = (X^2)^{1/2}$. Then, for all $t > 0$,
$$\Pr\{\|X\| \ge t\} \le \inf_{p \ge 1} \frac{\mathbb{E}\operatorname{tr}|X|^p}{t^p}.$$
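A minimal numerical sketch of that bound, on an arbitrary Hermitian ensemble of my own choosing, with $\mathbb{E}\operatorname{tr}|X|^p$ estimated by Monte Carlo and the infimum taken over a small grid of $p$:

```python
import numpy as np

rng = np.random.default_rng(1)
d, reps = 4, 5000

def sample_X():
    # Arbitrary centred Hermitian ensemble (a scaled Gaussian Wigner matrix);
    # any random Hermitian X would do here.
    A = rng.standard_normal((d, d))
    return (A + A.T) / (2 * np.sqrt(d))

# For Hermitian X the eigenvalues of |X| = (X^2)^{1/2} are |lambda_i(X)|.
abs_eigs = np.array([np.abs(np.linalg.eigvalsh(sample_X())) for _ in range(reps)])
norms = abs_eigs.max(axis=1)  # spectral norms ||X||

t = 2.0
empirical = (norms >= t).mean()
# E tr|X|^p estimated by Monte Carlo; take the smallest bound over a grid of p.
bounds = [(abs_eigs ** p).sum(axis=1).mean() / t ** p for p in range(1, 13)]
print(f"P(||X|| >= {t}) ~ {empirical:.4f} <= {min(bounds):.4f}")
```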
3 Matrix Bernstein
TBC. Bounds the spectral norm of a sum of independent, centred, uniformly bounded random matrices.
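One common statement (Tropp 2015): if $S_1, \dots, S_n$ are independent, centred $d_1 \times d_2$ random matrices with $\|S_k\| \le L$ almost surely, $Z = \sum_k S_k$, and $v(Z) = \max\{\|\mathbb{E} ZZ^*\|, \|\mathbb{E} Z^*Z\|\}$, then $\Pr\{\|Z\| \ge t\} \le (d_1 + d_2)\exp\bigl(-\tfrac{t^2/2}{v(Z) + Lt/3}\bigr)$. A quick sketch checking this on a Hermitian Rademacher series of my own construction (so $d_1 = d_2 = d$):

```python
import numpy as np

rng = np.random.default_rng(2)
d, n = 6, 100

# Fixed Hermitian coefficients A_k, normalised so ||A_k|| = 1 (hence L = 1);
# the summands are S_k = eps_k A_k with independent Rademacher signs eps_k,
# so that E[S_k] = 0 and ||S_k|| <= L almost surely.
As = []
for _ in range(n):
    B = rng.standard_normal((d, d))
    A = (B + B.T) / 2
    As.append(A / np.linalg.norm(A, 2))
L = 1.0
v = np.linalg.norm(sum(A @ A for A in As), 2)  # matrix variance statistic v(Z)

reps = 1000
Z_norms = np.array([
    np.linalg.norm(sum(e * A for e, A in zip(rng.choice([-1.0, 1.0], n), As)), 2)
    for _ in range(reps)
])

for t in (12.0, 16.0, 20.0):
    empirical = (Z_norms >= t).mean()
    bound = 2 * d * np.exp(-(t ** 2) / 2 / (v + L * t / 3))
    print(f"t={t}: empirical {empirical:.4f} <= bound {min(bound, 1.0):.4f}")
```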
4 Matrix Efron-Stein
The “classical” Efron-Stein inequalities are simple; the matrix versions are not, e.g. Paulin, Mackey, and Tropp (2016).
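For reference, the classical inequality reads: for independent coordinates $X_1, \dots, X_n$ and $X^{(i)}$ obtained by independently resampling coordinate $i$, $\operatorname{Var} f(X) \le \tfrac{1}{2} \sum_{i=1}^n \mathbb{E}\bigl(f(X) - f(X^{(i)})\bigr)^2$. A minimal Monte Carlo sketch of this classical version, using $f(x) = \max_i x_i$ as an arbitrary test statistic:

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 10, 20000

# f(X) = max_i X_i of n independent standard normals: an arbitrary
# non-linear statistic chosen for illustration.
X = rng.standard_normal((reps, n))
fX = X.max(axis=1)

# Right-hand side: resample one coordinate at a time and accumulate
# (1/2) * sum_i E[(f(X) - f(X^(i)))^2].
rhs = 0.0
for i in range(n):
    Xi = X.copy()
    Xi[:, i] = rng.standard_normal(reps)  # independent copy of coordinate i
    rhs += 0.5 * np.mean((fX - Xi.max(axis=1)) ** 2)

print(f"Var f(X) ~ {fX.var():.4f} <= Efron-Stein bound ~ {rhs:.4f}")
```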
5 Gaussian
Handy results from Vershynin (2018):

Take a standard Gaussian vector $g \sim \mathcal{N}(0, I_n)$.

Show that, for any fixed vectors $u, v \in \mathbb{R}^n$, $\langle g, u \rangle \sim \mathcal{N}(0, \|u\|_2^2)$ and $\mathbb{E}\,\langle g, u \rangle \langle g, v \rangle = \langle u, v \rangle$.

Given a vector $u \in S^{n-1}$, the marginal $\langle g, u \rangle$ is standard normal whatever the direction $u$, by rotation invariance.

Further, we know that $\mathbb{E}\,\|g\|_2^2 = n$ and that $\|g\|_2$ concentrates around $\sqrt{n}$.

Grothendieck’s identity: For any fixed vectors $u, v \in S^{n-1}$,
$$\mathbb{E}\,\operatorname{sign}\langle g, u \rangle \,\operatorname{sign}\langle g, v \rangle = \frac{2}{\pi} \arcsin \langle u, v \rangle.$$
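A quick Monte Carlo check of Grothendieck’s identity, for two arbitrarily chosen unit vectors:

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps = 3, 200_000

# Two arbitrary fixed unit vectors; <u, v> = 0.6.
u = np.array([1.0, 0.0, 0.0])
v = np.array([0.6, 0.8, 0.0])

g = rng.standard_normal((reps, n))
lhs = np.mean(np.sign(g @ u) * np.sign(g @ v))
rhs = 2 / np.pi * np.arcsin(u @ v)
print(f"E sign<g,u> sign<g,v> ~ {lhs:.4f}  vs  (2/pi) arcsin<u,v> = {rhs:.4f}")
```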
- Nick Higham, Eigenvalue Inequalities for Hermitian Matrices