(Nearly-)Convex relaxation of nonconvex problems

A particular, analytically tractable way of overparameterising optimisation problems to make them β€œnice” in the sense of being easy to analyse, or solve, via the tools of convex optimisation. Popular in kernel methods, compressive sensing, matrix factorisation, phase retrieval, sparse coding and probably other things besides.
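As a concrete toy instance (my own sketch, not drawn from any particular paper below): the combinatorial β„“0 sparse-recovery problem from compressive sensing relaxes to the convex β„“1 (basis pursuit / LASSO) problem, which a few lines of proximal gradient descent (ISTA) can solve:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse recovery: find a sparse x with A x = y.
# Minimising ||x||_0 subject to A x = y is nonconvex and NP-hard;
# the classic relaxation swaps l0 for l1 (LASSO):
#   min_x  0.5 * ||A x - y||^2 + lam * ||x||_1
n, d, k = 50, 100, 3                     # measurements, dimension, sparsity
A = rng.standard_normal((n, d)) / np.sqrt(n)
x_true = np.zeros(d)
x_true[rng.choice(d, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

def ista(A, y, lam=0.01, iters=2000):
    """Proximal gradient (ISTA) for the l1-relaxed problem."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - y)            # gradient of the smooth data-fit term
        z = x - g / L
        # soft-thresholding = proximal operator of the l1 penalty
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x

x_hat = ista(A, y)
print(np.linalg.norm(x_hat - x_true))    # small: the convex relaxation recovers x_true
```

With enough random Gaussian measurements relative to the sparsity, the relaxed convex problem has the same solution as the original nonconvex one, which is exactly the kind of guarantee papers such as Tropp (2006) establish.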


Francis Bach is interested in a particular specialisation, sum-of-squares relaxation. In Sums-of-squares for dummies: a view from the Fourier domain, he writes:

> In these last two years, I have been studying intensively sum-of-squares relaxations for optimization, learning a lot from many great research papers [1, 2], review papers [3], books [4, 5, 6, 7, 8], and even websites.
>
> Much of the literature focuses on polynomials as the de facto starting point. While this leads to deep connections between many fields within mathematics, and many applications in various areas (optimal control, data science, etc.), the need for arguably non-natural hierarchies (at least for beginners) sometimes makes the exposition hard to follow at first, and notations a tiny bit cumbersome.
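To make the object concrete (a toy sketch of my own, not from Bach's post): a polynomial p is a sum of squares iff p(x) = m(x)α΅€ Q m(x) for some positive semidefinite Gram matrix Q over a monomial basis m(x). Finding such a Q is a convex (semidefinite) feasibility problem; here the Q is hand-picked just to illustrate the certificate:

```python
import numpy as np

# Toy SOS certificate: p(x) = x^4 + 2 x^2 + 1 = (x^2 + 1)^2.
# With monomial basis m(x) = [1, x, x^2], p(x) = m(x)^T Q m(x) for the
# (hand-picked) Gram matrix below; Q being PSD certifies p >= 0 everywhere.
Q = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])

def eval_via_gram(Q, x):
    m = np.array([1.0, x, x ** 2])       # monomial basis vector
    return m @ Q @ m

# Sanity check: the Gram representation matches p at a few points...
for x in (-2.0, 0.0, 1.5):
    assert np.isclose(eval_via_gram(Q, x), x ** 4 + 2 * x ** 2 + 1)

# ...and Q is PSD (all eigenvalues nonnegative), so p is a sum of squares.
print(np.linalg.eigvalsh(Q).min() >= 0)  # True
```

In practice one does not hand-pick Q but searches for it with an SDP solver; that search over PSD Gram matrices is the convex relaxation of global polynomial nonnegativity.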


Ahmed, Ali, Benjamin Recht, and Justin Romberg. 2012. β€œBlind Deconvolution Using Convex Programming.” arXiv:1211.5608 [Cs, Math], November.
Awasthi, Pranjal, Afonso S. Bandeira, Moses Charikar, Ravishankar Krishnaswamy, Soledad Villar, and Rachel Ward. 2015. β€œRelax, No Need to Round: Integrality of Clustering Formulations.” In Proceedings of the 2015 Conference on Innovations in Theoretical Computer Science, 191–200. ITCS ’15. New York, NY, USA: ACM.
Bach, Francis. 2013. β€œConvex Relaxations of Structured Matrix Factorizations.” arXiv:1309.3117 [Cs, Math], September.
Bahmani, Sohail, and Justin Romberg. 2014. β€œLifting for Blind Deconvolution in Random Mask Imaging: Identifiability and Convex Relaxation.” arXiv:1501.00046 [Cs, Math, Stat], December.
β€”β€”β€”. 2016. β€œPhase Retrieval Meets Statistical Learning Theory: A Flexible Convex Relaxation.” arXiv:1610.04210 [Cs, Math, Stat], October.
Bubeck, SΓ©bastien. 2015. Convex Optimization: Algorithms and Complexity. Vol. 8. Foundations and Trends in Machine Learning. Now Publishers.
Diamond, Steven, Reza Takapoui, and Stephen Boyd. 2016. β€œA General System for Heuristic Solution of Convex Problems over Nonconvex Sets.” arXiv Preprint arXiv:1601.07277.
Goldstein, Tom, and Christoph Studer. 2016. β€œPhaseMax: Convex Phase Retrieval via Basis Pursuit.” arXiv:1610.07531 [Cs, Math], October.
Haeffele, Benjamin D., and Rene Vidal. 2015. β€œGlobal Optimality in Tensor Factorization, Deep Learning, and Beyond.” arXiv:1506.07540 [Cs, Stat], June.
Iguchi, Takayuki, Dustin G. Mixon, Jesse Peterson, and Soledad Villar. 2015. β€œProbably Certifiably Correct k-Means Clustering.” arXiv:1509.07983 [Cs, Math, Stat], September.
Papyan, Vardan, Jeremias Sulam, and Michael Elad. 2016. β€œWorking Locally Thinking Globally-Part II: Stability and Algorithms for Convolutional Sparse Coding.” arXiv Preprint arXiv:1607.02009.
β€”β€”β€”. 2017. β€œWorking Locally Thinking Globally: Theoretical Guarantees for Convolutional Sparse Coding.” IEEE Transactions on Signal Processing 65 (21): 5687–5701.
Sebek, Michael. 2015. β€œSpectral Factorization.” In Encyclopedia of Systems and Control, edited by John Baillieul and Tariq Samad, 1289–95. London: Springer.
Tropp, J.A. 2006. β€œJust Relax: Convex Programming Methods for Identifying Sparse Signals in Noise.” IEEE Transactions on Information Theory 52 (3): 1030–51.
Tropp, Joel A. 2006. β€œAlgorithms for Simultaneous Sparse Approximation. Part II: Convex Relaxation.” Signal Processing, Sparse Approximations in Signal and Image ProcessingSparse Approximations in Signal and Image Processing, 86 (3): 589–602.
