Restricted isometry properties

Plus incoherence, irrepresentability, and other uncertainty bounds for a sparse world, and maybe frame theory, what’s that now?



Restricted isometry properties are also known as uniform uncertainty principles (E. Candès and Tao 2005; E. J. Candès, Romberg, and Tao 2006); close relatives include mutual incoherence (David L. Donoho 2006; D. L. Donoho, Elad, and Temlyakov 2006) and irrepresentability conditions (Zhao and Yu 2006).

This is mostly notes while I learn some definitions; expect no actual thoughts.

Recoverability conditions, as seen in sparse regression, sparse basis dictionaries, function approximation, compressed sensing, etc. If you squint right you could imagine these as uncertainty principles for a sparse world, or as the foundations of a particular type of sampling theory.

Terry Tao mentions the various related conditions for the compressed sensing problem, and which types of random matrices satisfy them.

Restricted Isometry

The compressed sensing formulation.

The chatty lecture notes on uniform uncertainty look fun.

The restricted isometry constant of a matrix \(A\) is the smallest constant \(\delta_s(A)\) such that

\[ (1-\delta_s(A))\|x\|_2^2\leq \|Ax\|_2^2\leq (1+\delta_s(A))\|x\|_2^2 \]

for all \(s\)-sparse \(x\). That is, the measurement matrix does not change the norm of sparse signals “too much”, and in particular does not null them when \(\delta_s(A) < 1.\)
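
To make the definition concrete, here is a minimal brute-force sketch (my own construction, not from any of the cited papers) that computes \(\delta_s(A)\) for a small matrix by scanning every size-\(s\) support. The scan is combinatorial in \(m\), so this is strictly a toy.

```python
import itertools

import numpy as np

def rip_constant(A: np.ndarray, s: int) -> float:
    """Brute-force delta_s(A): the worst deviation from 1 of the
    squared singular values over all size-s column submatrices."""
    m = A.shape[1]
    delta = 0.0
    for support in itertools.combinations(range(m), s):
        sigma = np.linalg.svd(A[:, list(support)], compute_uv=False)
        delta = max(delta, float(np.max(np.abs(sigma**2 - 1.0))))
    return delta

# Gaussian matrices with columns scaled by 1/sqrt(n) are the classic
# example: they satisfy RIP with high probability once n is of order
# s log(m/s) (Baraniuk et al. 2008).
rng = np.random.default_rng(0)
n, m, s = 20, 40, 3
A = rng.standard_normal((n, m)) / np.sqrt(n)
print(rip_constant(A, s))
```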

Irrepresentability

The setup is a little different for regression-type problems, which is where “representability” comes from. Here we also care about the design (roughly, the dependence structure of the covariates we actually observe) and about the noise distribution.

Zhao and Yu (2006) present an abstract condition called strong irrepresentability, which guarantees asymptotic sign consistency of selection. See also Meinshausen and Bühlmann (2006), who call this neighborhood stability, which is even less catchy.
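
Roughly, and in my own notation (which may not match Zhao and Yu’s exactly): write \(S\) for the active set and \(C = X^\top X/n\) for the empirical Gram matrix; strong irrepresentability demands \(|C_{S^c S} C_{S S}^{-1}\operatorname{sign}(\beta_S)| \leq \mathbf{1} - \eta\) elementwise for some \(\eta > 0\). Loosely, the irrelevant covariates must not be too well explained by the relevant ones. A quick sketch to check it numerically:

```python
import numpy as np

def irrepresentable_margin(X: np.ndarray, beta: np.ndarray) -> float:
    """Return 1 - max_j |(C_{S^c,S} C_{S,S}^{-1} sign(beta_S))_j|.
    A positive margin means the strong irrepresentable condition
    holds with eta equal to that margin."""
    n, p = X.shape
    C = X.T @ X / n
    S = np.flatnonzero(beta)
    Sc = np.setdiff1d(np.arange(p), S)
    v = C[np.ix_(Sc, S)] @ np.linalg.solve(C[np.ix_(S, S)], np.sign(beta[S]))
    return 1.0 - float(np.max(np.abs(v)))

# Independent Gaussian covariates usually give a comfortable margin.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))
beta = np.zeros(10)
beta[:3] = [2.0, -1.5, 1.0]
print(irrepresentable_margin(X, beta))
```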

More recently Meinshausen and Yu (2009) extend this (and explain the original irrepresentability more clearly IMO):

Here we examine the behavior of the Lasso estimators if the irrepresentable condition is relaxed. Even though the Lasso cannot recover the correct sparsity pattern, we show that the estimator is still consistent in the l2-norm sense for fixed designs under conditions on (a) the number \(s_n\) of nonzero components of the vector \(\beta_n\) and (b) the minimal singular values of design matrices that are induced by selecting small subsets of variables. Furthermore, a rate of convergence result is obtained on the l2 error with an appropriate choice of the smoothing parameter.

They do a good job of uniting prediction-error and model-selection consistency approaches. In fact, I will base everything off Meinshausen and Yu (2009), since not only is the prose lucid, it also gives the background to the design assumptions and the relaxation of coherence.

TBC.

Incoherence

A Basis-Pursuit noise-free setting.

D. L. Donoho, Elad, and Temlyakov (2006):

We can think of the atoms in our dictionary as columns in a matrix \(\Phi\), so that \(\Phi\) is \(n\) by \(m\) and \(m > n\). A representation of \(y\in\mathbb{R}^n\) can be thought of as a vector \(\alpha\in\mathbb{R}^m\) satisfying \(y=\Phi\alpha.\)

The concept of mutual coherence of the dictionary […] is defined, assuming that the columns of \(\Phi\) are normalized to unit \(\ell^2\)-norm, in terms of the Gram matrix \(G=\Phi^T\Phi\). With \(G(k,j)\) denoting entries of this matrix, the mutual coherence is

\[ M(\Phi) = \max_{1\leq k, j\leq m, k\neq j} |G(k,j)| \]

A dictionary is incoherent if \(M\) is small.
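
Computing \(M(\Phi)\) is a one-liner from the definition above; a minimal sketch (the spikes-plus-Hadamard demo at the end, with coherence \(1/\sqrt{n}\), is my own choice of example and assumes SciPy is available):

```python
import numpy as np
from scipy.linalg import hadamard

def mutual_coherence(Phi: np.ndarray) -> float:
    """Largest absolute off-diagonal entry of the Gram matrix,
    after normalizing columns to unit ell-2 norm."""
    Phi = Phi / np.linalg.norm(Phi, axis=0)
    G = Phi.T @ Phi
    np.fill_diagonal(G, 0.0)  # exclude the k = j terms
    return float(np.max(np.abs(G)))

# Spikes plus a normalized Hadamard basis: two maximally misaligned
# orthonormal bases, giving M(Phi) = 1/sqrt(n).
n = 64
Phi = np.hstack([np.eye(n), hadamard(n) / np.sqrt(n)])
print(mutual_coherence(Phi))  # 0.125 = 1/sqrt(64)
```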

Frame theory

See frames.

References

Adcock, Ben, Anders C. Hansen, and Bogdan Roman. 2015. “The Quest for Optimal Sampling: Computationally Efficient, Structure-Exploiting Measurements for Compressed Sensing.” In Compressed Sensing and Its Applications: MATHEON Workshop 2013, edited by Holger Boche, Robert Calderbank, Gitta Kutyniok, and Jan Vybíral, 143–67. Applied and Numerical Harmonic Analysis. Cham: Springer International Publishing.
Baraniuk, Richard G., Volkan Cevher, Marco F. Duarte, and Chinmay Hegde. 2010. “Model-Based Compressive Sensing.” IEEE Transactions on Information Theory 56 (4): 1982–2001.
Baraniuk, Richard, Mark Davenport, Ronald DeVore, and Michael Wakin. 2008. “A Simple Proof of the Restricted Isometry Property for Random Matrices.” Constructive Approximation 28 (3): 253–63.
Barron, Andrew R., Albert Cohen, Wolfgang Dahmen, and Ronald A. DeVore. 2008. “Approximation and Learning by Greedy Algorithms.” The Annals of Statistics 36 (1): 64–94.
Cai, T. Tony, Guangwu Xu, and Jun Zhang. 2008. “On Recovery of Sparse Signals via ℓ1 Minimization.” arXiv:0805.0149 [Cs], May.
Candès, Emmanuel J. 1999. “Harmonic Analysis of Neural Networks.” Applied and Computational Harmonic Analysis 6 (2): 197–218.
Candès, Emmanuel J., Yonina C. Eldar, Deanna Needell, and Paige Randall. 2011. “Compressed Sensing with Coherent and Redundant Dictionaries.” Applied and Computational Harmonic Analysis 31 (1): 59–73.
Candès, Emmanuel J., Justin K. Romberg, and Terence Tao. 2006. “Stable Signal Recovery from Incomplete and Inaccurate Measurements.” Communications on Pure and Applied Mathematics 59 (8): 1207–23.
Candès, Emmanuel J., and Terence Tao. 2006. “Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?” IEEE Transactions on Information Theory 52 (12): 5406–25.
———. 2008. “The Uniform Uncertainty Principle and Compressed Sensing.”
Candès, Emmanuel, and Terence Tao. 2005. “Decoding by Linear Programming.” IEEE Transactions on Information Theory 51 (12): 4203–15.
Christensen, Ole. 2016. An Introduction to Frames and Riesz Bases. Second edition. Applied and Numerical Harmonic Analysis. Cham: Springer International Publishing.
Daubechies, I. 1990. “The Wavelet Transform, Time-Frequency Localization and Signal Analysis.” IEEE Transactions on Information Theory 36 (5): 961–1005.
Daubechies, Ingrid. 1992. Ten Lectures on Wavelets. Philadelphia, PA: Society for Industrial and Applied Mathematics.
Daubechies, Ingrid, Ronald DeVore, Massimo Fornasier, and C. Si̇nan Güntürk. 2010. “Iteratively Reweighted Least Squares Minimization for Sparse Recovery.” Communications on Pure and Applied Mathematics 63 (1): 1–38.
Donoho, D. L., M. Elad, and V. N. Temlyakov. 2006. “Stable Recovery of Sparse Overcomplete Representations in the Presence of Noise.” IEEE Transactions on Information Theory 52 (1): 6–18.
Donoho, David L. 2006. “Compressed Sensing.” IEEE Transactions on Information Theory 52 (4): 1289–1306.
Donoho, David L., and Michael Elad. 2003. “Optimally Sparse Representation in General (Nonorthogonal) Dictionaries via ℓ1 Minimization.” Proceedings of the National Academy of Sciences 100 (5): 2197–2202.
Duffin, R. J., and A. C. Schaeffer. 1952. “A Class of Nonharmonic Fourier Series.” Transactions of the American Mathematical Society 72 (2): 341–66.
Flammia, Steven T., David Gross, Yi-Kai Liu, and Jens Eisert. 2012. “Quantum Tomography via Compressed Sensing: Error Bounds, Sample Complexity, and Efficient Estimators.” New Journal of Physics 14 (9): 095022.
Foygel, Rina, and Nathan Srebro. 2011. “Fast-Rate and Optimistic-Rate Error Bounds for L1-Regularized Regression.” arXiv:1108.0373 [Math, Stat], August.
Geer, Sara van de. 2014. “Worst Possible Sub-Directions in High-Dimensional Models.” In arXiv:1403.7023 [Math, Stat]. Vol. 131.
Hegde, Chinmay, and Richard G. Baraniuk. 2012. “Signal Recovery on Incoherent Manifolds.” IEEE Transactions on Information Theory 58 (12): 7204–14.
Heil, C., and D. Walnut. 1989. “Continuous and Discrete Wavelet Transforms.” SIAM Review 31 (4): 628–66.
Jung, Alexander, Nguyen Tran Quang, and Alexandru Mara. 2017. “When Is Network Lasso Accurate?” arXiv:1704.02107 [Stat], April.
Kovačević, Jelena, and Amina Chebira. 2008. An Introduction to Frames. Vol. 2.
Krahmer, Felix, Gitta Kutyniok, and Jakob Lemvig. 2014. “Sparse Matrices in Frame Theory.” Computational Statistics 29 (3-4): 547–68.
Lederer, Johannes, and Michael Vogt. 2020. “Estimating the Lasso’s Effective Noise.” arXiv:2004.11554 [Stat], April.
Meinshausen, Nicolai, and Peter Bühlmann. 2006. “High-Dimensional Graphs and Variable Selection with the Lasso.” The Annals of Statistics 34 (3): 1436–62.
Meinshausen, Nicolai, and Bin Yu. 2009. “Lasso-Type Recovery of Sparse Representations for High-Dimensional Data.” The Annals of Statistics 37 (1): 246–70.
Mixon, Dustin G. 2012. “Sparse Signal Processing with Frame Theory.”
Morgenshtern, Veniamin I., and Helmut Bölcskei. 2011. “A Short Course on Frame Theory.”
Needell, Deanna, and Roman Vershynin. 2009. “Uniform Uncertainty Principle and Signal Recovery via Regularized Orthogonal Matching Pursuit.” Foundations of Computational Mathematics 9 (3): 317–34.
Raskutti, Garvesh, Martin J. Wainwright, and Bin Yu. 2010. “Restricted Eigenvalue Properties for Correlated Gaussian Designs.” Journal of Machine Learning Research 11 (Aug): 2241–59.
Ravikumar, Pradeep, Martin J. Wainwright, Garvesh Raskutti, and Bin Yu. 2011. “High-Dimensional Covariance Estimation by Minimizing ℓ1-Penalized Log-Determinant Divergence.” Electronic Journal of Statistics 5: 935–80.
Rish, Irina, and Genady Grabarnik. 2014. “Sparse Signal Recovery with Exponential-Family Noise.” In Compressed Sensing & Sparse Filtering, edited by Avishy Y. Carmi, Lyudmila Mihaylova, and Simon J. Godsill, 77–93. Signals and Communication Technology. Springer Berlin Heidelberg.
Soni, Akshay, and Yashar Mehdad. 2017. “RIPML: A Restricted Isometry Property Based Approach to Multilabel Learning.” arXiv:1702.05181 [Cs, Stat], February.
Wu, R., W. Huang, and D. R. Chen. 2013. “The Exact Support Recovery of Sparse Signals With Noise via Orthogonal Matching Pursuit.” IEEE Signal Processing Letters 20 (4): 403–6.
Young, Robert M. 2001. An Introduction to Non-Harmonic Fourier Series. Revised edition. Academic Press.
Zhang, Cun-Hui. 2010. “Nearly Unbiased Variable Selection Under Minimax Concave Penalty.” The Annals of Statistics 38 (2): 894–942.
Zhao, Peng, and Bin Yu. 2006. “On Model Selection Consistency of Lasso.” Journal of Machine Learning Research 7 (Nov): 2541–63.
