One way I can get at the confusing behaviours of high-dimensional distributions is to look at low-dimensional projections of them. If I have a (possibly fixed) data matrix and a random low-dimensional projection of it, what distribution does the projection have?
This idea pertains to many others: matrix factorisations, restricted isometry properties, Riesz bases, randomised regression, compressed sensing. You could also consider these results as arising from/resulting in certain structured random matrices.
Tutorials
There is a confusing note soup here, sorry. You might find it better to read a coherent overview such as Meckes’ lecture slides, which include a lot of important recent developments, many of which she invented.
Related: Weird slicing problems in convex geometry. For theoretical background on how these relate, see Guédon (2014).
Inner products
Djalil Chafaï introduces the Funk-Hecke formula, also mentioned under isotropic RVs, which gives us a formula for a particularly simple case, that of unit-norm RVs:
… if $X$ is a random vector of $\mathbb{R}^n$ uniformly distributed on $\mathbb{S}^{n-1}$ then for all $y \in \mathbb{S}^{n-1}$, the law of $\langle X, y \rangle$ has density $$t \mapsto \frac{\Gamma\left(\frac{n}{2}\right)}{\sqrt{\pi}\,\Gamma\left(\frac{n-1}{2}\right)}\,(1-t^2)^{\frac{n-3}{2}}\,\mathbf{1}_{[-1,1]}(t).$$ This law does not depend on the choice of $y$. It is symmetric in the sense that $\langle X, y \rangle$ and $-\langle X, y \rangle$ have the same law. The law of $\langle X, y \rangle$ is the image of the law $\mathrm{Beta}\left(\frac{n-1}{2}, \frac{n-1}{2}\right)$ by the map $t \mapsto 2t - 1$. The law of $\langle X, y \rangle$ is
- if $n = 2$: an arcsine law,
- if $n = 3$: a uniform law (Archimedes principle),
- if $n = 4$: a semicircle law.
whuber asserts that $\frac{1 + \langle X, y \rangle}{2} \sim \mathrm{Beta}\left(\frac{n-1}{2}, \frac{n-1}{2}\right)$ and $\langle X, y \rangle^2 \sim \mathrm{Beta}\left(\frac{1}{2}, \frac{n-1}{2}\right)$.
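A quick Monte Carlo sanity check of that Beta description. This is a sketch of my own, not from the sources above; the dimension `n` and sample count are arbitrary choices.

```python
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(0)
n = 5  # dimension of the ambient space, so X lives on S^{n-1}; any n >= 2 works

# Uniform points on the sphere: normalise standard Gaussian vectors
Z = rng.standard_normal((100_000, n))
X = Z / np.linalg.norm(Z, axis=1, keepdims=True)
t = X[:, 0]  # <X, y> with y the first standard basis vector

# (1 + t)/2 should follow Beta((n-1)/2, (n-1)/2)
u = (1 + t) / 2
qs = np.linspace(0.05, 0.95, 19)
print(np.abs(np.quantile(u, qs) - beta.ppf(qs, (n - 1) / 2, (n - 1) / 2)).max())
# prints a small number (~1e-3) if the Beta description is right
```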
Random projections are kinda Gaussian
More generally, things are not so exact, but they are still reasonably nice, in that there are lots of tasty limit theorems with regular behaviour.
A classic introductory concept: the Diaconis-Freedman effect. Diaconis and Freedman (1984) show that (under some mild omitted conditions) if $x_1, \dots, x_n \in \mathbb{R}^d$ is a data set (possibly deterministic, with no assumption on the generating process), $\theta$ is a uniform random point on the sphere $\mathbb{S}^{d-1}$, and $\mu_\theta$ is the empirical measure of the projections $\langle x_i, \theta \rangle$ of the $x_i$ onto $\theta$, then as $n, d \to \infty$ the measures $\mu_\theta$ tend to a fixed Gaussian law $\mathcal{N}(0, \sigma^2)$ weakly in probability. This succinct statement is modeled on Elizabeth Meckes’.
A lesson is that even non-Gaussian, non-independent data can become nearly i.i.d. Gaussian in low-dimensional projection, as Dasgupta, Hsu, and Verma (2006) argue in their introduction; the sketch below illustrates it.
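Here is a minimal demonstration, assuming nothing beyond NumPy and SciPy; the hypercube data, dimensions, and test are arbitrary choices of mine (the theorem allows deterministic data too).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, d = 2000, 500
# Decidedly non-Gaussian data: each row is a vertex of the hypercube {-1, 1}^d
x = rng.choice([-1.0, 1.0], size=(n, d))

theta = rng.standard_normal(d)
theta /= np.linalg.norm(theta)  # a uniform random direction on S^{d-1}
proj = x @ theta                # the n projected data points

# Each row has norm sqrt(d), so the empirical law of the projections
# should be close to N(0, 1); a KS test should fail to reject normality.
print(stats.kstest(proj, "norm"))
```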
This has been taken to incredible depth in the work of Elizabeth Meckes (1980–2020), whose papers serve as the canonical textbook in the area for now. Two foundational ones are Chatterjee and Meckes (2008) and E. Meckes (2009), and there is a kind of user guide in E. Meckes (2012b), which leverages Stein’s method a whole bunch.
Random projections are distance preserving
This is what makes random embeddings go. The most famous result is the Johnson-Lindenstrauss lemma.
A simple proof is given by Dasgupta and Gupta (2003).
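To make the lemma concrete, here is a sketch with a dense Gaussian projection. The constant 8 in the target dimension and the synthetic point cloud are heuristic assumptions of mine, not part of the lemma’s statement.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, eps = 200, 10_000, 0.25
k = int(np.ceil(8 * np.log(n) / eps**2))  # target dimension, O(log n / eps^2)

x = rng.standard_normal((n, d))           # any n points work; JL is data-agnostic
P = rng.standard_normal((d, k)) / np.sqrt(k)
y = x @ P                                 # the random k-dimensional projection

def pdist2(z):
    """Matrix of squared pairwise Euclidean distances."""
    sq = (z ** 2).sum(axis=1)
    return sq[:, None] + sq[None, :] - 2 * z @ z.T

idx = np.triu_indices(n, 1)
ratio = pdist2(y)[idx] / pdist2(x)[idx]
print(ratio.min(), ratio.max())  # w.h.p. within (1 - eps, 1 + eps)
```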
Projection statistics
Another key phrase we can look for is probability on the Stiefel manifold, which generalizes random orthonormal matrices to maps between spaces of different dimension. Formally, the Stiefel manifold $V_k(\mathbb{R}^n)$ is the space of $k$-frames in $n$-dimensional real Euclidean space, represented by the set of $n \times k$ matrices $X$ such that $X^{\top} X = I_k$, where $I_k$ is the $k \times k$ identity matrix. The cases with $k \ll n$ are the ones that serve low-dimensional projections especially.
Cool results in this domain are, e.g., Chikuse (2003); E. S. Meckes and Meckes (2013); E. Meckes (2012a); Stam (1982).
General results on projections are in Dümbgen and Del Conte-Zerial (2013).
An important trick is the distribution of isotropic unit vectors. Let $Z$ be a random matrix in $\mathbb{R}^{n \times k}$ with independent, standard Gaussian column vectors. Then $X = Z(Z^{\top}Z)^{-1/2}$ has the desired uniform distribution on $V_k(\mathbb{R}^n)$.
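In code, that trick looks like the following sketch; the symmetric square root is one choice of orthogonalisation, and QR with a sign correction on $R$ is a common alternative.

```python
import numpy as np
from scipy.linalg import sqrtm

rng = np.random.default_rng(3)
n, k = 50, 3

Z = rng.standard_normal((n, k))        # independent standard Gaussian columns
X = Z @ np.linalg.inv(sqrtm(Z.T @ Z))  # X = Z (Z^T Z)^{-1/2}

print(np.allclose(X.T @ X, np.eye(k)))  # True: X is a k-frame, X^T X = I_k
```

For $k = n$ this reduces to sampling a Haar-distributed orthogonal matrix, for which `scipy.stats.ortho_group` is a ready-made alternative.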
Vershynin’s writing on a variety of hard high-dimensional probability results is pretty accessible: Vershynin (2015); Vershynin (2018). These bleed over into concentration results.
I wrote one of my own… TBD.
Concentration theorems for projections
Many, e.g. Dasgupta, Hsu, and Verma (2006); Dümbgen and Del Conte-Zerial (2013); Gantert, Kim, and Ramanan (2017); Kim, Liao, and Ramanan (2020).
References
Achlioptas. 2003.
“Database-Friendly Random Projections: Johnson-Lindenstrauss with Binary Coins.” Journal of Computer and System Sciences, Special Issue on PODS 2001.
Anttila, Ball, and Perissinaki. 2003.
“The Central Limit Problem for Convex Bodies.” Transactions of the American Mathematical Society.
Beckner. 1989.
“A Generalized Poincare Inequality for Gaussian Measures.” Proceedings of the American Mathematical Society.
Bhattacharya, and Bhattacharya. 2012.
Nonparametric Inference on Manifolds: With Applications to Shape Spaces. Institute of Mathematical Statistics Monographs.
Bingham, and Mannila. 2001.
“Random Projection in Dimensionality Reduction: Applications to Image and Text Data.” In
Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. KDD ’01.
Bobkov, and Koldobsky. 2003.
“On the Central Limit Property of Convex Bodies.” In
Geometric Aspects of Functional Analysis: Israel Seminar 2001-2002. Lecture Notes in Mathematics.
Brehm, and Voigt. n.d. “Asymptotics of Cross Sections for Convex Bodies.”
Dasgupta. 2000.
“Experiments with Random Projection.” In
Proceedings of the Sixteenth Conference on Uncertainty in Artificial Intelligence. UAI’00.
Dasgupta, and Gupta. 2003.
“An Elementary Proof of a Theorem of Johnson and Lindenstrauss.” Random Structures & Algorithms.
Dasgupta, Hsu, and Verma. 2006.
“A Concentration Theorem for Projections.” In
Proceedings of the Twenty-Second Conference on Uncertainty in Artificial Intelligence. UAI’06.
Diaconis, and Freedman. 1984.
“Asymptotics of Graphical Projection Pursuit.” The Annals of Statistics.
Dümbgen, and Del Conte-Zerial. 2013.
“On Low-Dimensional Projections of High-Dimensional Distributions.” From Probability to Statistics and Back: High-Dimensional Models and Processes – A Festschrift in Honor of Jon A. Wellner.
Eldan, and Klartag. 2008.
“Pointwise Estimates for Marginals of Convex Bodies.” Journal of Functional Analysis.
Freund, Dasgupta, Kabra, et al. 2007.
“Learning the Structure of Manifolds Using Random Projections.” In
Advances in Neural Information Processing Systems.
Gantert, Kim, and Ramanan. 2017.
“Large Deviations for Random Projections of Balls.” The Annals of Probability.
Guédon. 2014.
“Concentration Phenomena in High Dimensional Geometry.” Edited by Arnaud Guillin.
ESAIM: Proceedings.
Houdré, Ledoux, and Milman. 2011. Concentration, Functional Inequalities and Isoperimetry: International Workshop on Concentration, Functional Inequalities, and Isoperimetry, October 29-November 1, 2009, Florida Atlantic University, Boca Raton, Florida.
Indyk, and Motwani. 1998.
“Approximate Nearest Neighbors: Towards Removing the Curse of Dimensionality.” In
Proceedings of the Thirtieth Annual ACM Symposium on Theory of Computing. STOC ’98.
Jiang, Lee, and Vempala. 2020.
“A Generalized Central Limit Conjecture for Convex Bodies.” In
Geometric Aspects of Functional Analysis: Israel Seminar (GAFA) 2017-2019 Volume II. Lecture Notes in Mathematics.
Joarder, and Ali. 1996.
“On the Characterization of Spherical Distributions.” Journal of Information and Optimization Sciences.
Kar, and Karnick. 2012.
“Random Feature Maps for Dot Product Kernels.” In
Artificial Intelligence and Statistics.
Klartag. 2007.
“A Central Limit Theorem for Convex Sets.” Inventiones Mathematicae.
Lahiri, Gao, and Ganguli. 2016.
“Random Projections of Random Manifolds.” arXiv:1607.04331 [Cs, q-Bio, Stat].
Li, Hastie, and Church. 2006.
“Very Sparse Random Projections.” In
Proceedings of the 12th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. KDD ’06.
Meckes, Elizabeth. 2006. “An Infinitesimal Version of Stein’s Method of Exchangeable Pairs.”
Meckes, Elizabeth. 2009.
“On Stein’s Method for Multivariate Normal Approximation.” In
High Dimensional Probability V: The Luminy Volume.
———. 2012a.
“Projections of Probability Distributions: A Measure-Theoretic Dvoretzky Theorem.” In
Geometric Aspects of Functional Analysis: Israel Seminar 2006–2010. Lecture Notes in Mathematics.
———. 2012b.
“Approximation of Projections of Random Vectors.” Journal of Theoretical Probability.
Meckes, Elizabeth S., and Meckes. 2013.
“Concentration and Convergence Rates for Spectral Measures of Random Matrices.” Probability Theory and Related Fields.
Paouris. 2006. “Concentration of Mass on Convex Bodies.”
Peña, Lai, and Shao. 2008. Self-Normalized Processes: Limit Theory and Statistical Applications.
Reeves. 2017.
“Conditional Central Limit Theorems for Gaussian Projections.” In
2017 IEEE International Symposium on Information Theory (ISIT).
Rudelson, and Vershynin. 2013.
“Hanson-Wright Inequality and Sub-Gaussian Concentration.” Electronic Communications in Probability.
Stein. 1972.
“A Bound for the Error in the Normal Approximation to the Distribution of a Sum of Dependent Random Variables.” Proceedings of the Sixth Berkeley Symposium on Mathematical Statistics and Probability, Volume 2: Probability Theory.
Vershynin. 2015.
“Estimation in High Dimensions: A Geometric Perspective.” In
Sampling Theory, a Renaissance: Compressive Sensing and Other Developments. Applied and Numerical Harmonic Analysis.
———. 2018.
High-Dimensional Probability: An Introduction with Applications in Data Science. Cambridge Series in Statistical and Probabilistic Mathematics.