(Nearly-)Convex relaxation of nonconvex problems
April 4, 2018 — March 17, 2023
A particular, analytically tractable way of overparameterising an optimisation problem to make it “nice”, in the sense of being easy to analyse, or to solve, via the tools of convex optimization. Popular in kernel methods, compressive sensing, matrix factorization, phase retrieval, sparse coding and probably other things besides.
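To make the idea concrete, here is a minimal sketch of perhaps the best-known convex relaxation in the compressive-sensing/sparse-coding setting: replacing the nonconvex ℓ0 penalty with the ℓ1 norm (basis pursuit / lasso) and solving the resulting convex problem by plain ISTA in numpy. The problem sizes, sparsity pattern, step size and regularisation weight below are all illustrative choices of mine, not taken from the text.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (the key to the l1 relaxation)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """Minimise the convex surrogate 0.5 * ||A x - y||^2 + lam * ||x||_1,
    standing in for the nonconvex l0-penalised recovery problem."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - (A.T @ (A @ x - y)) / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))       # 50 measurements, 200 unknowns
x_true = np.zeros(200)
x_true[[3, 77, 150]] = [1.5, -2.0, 1.0]  # 3-sparse ground truth
y = A @ x_true
x_hat = ista(A, y, lam=0.1)
```

Despite the original ℓ0 problem being combinatorially hard, the convex surrogate recovers the support of `x_true` here, which is the whole point of the relaxation.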
1 Incoming
Francis Bach is interested in a particular specialization, sum-of-squares relaxation. See Sums-of-squares for dummies: a view from the Fourier domain:
In these last two years, I have been studying intensively sum-of-squares relaxations for optimization, learning a lot from many great research papers [1, 2], review papers [3], books [4, 5, 6, 7, 8], and even websites.
Much of the literature focuses on polynomials as the de facto starting point. While this leads to deep connections between many fields within mathematics, and many applications in various areas (optimal control, data science, etc.), the need for arguably non-natural hierarchies (at least for beginners) sometimes makes the exposition hard to follow at first, and notations a tiny bit cumbersome.
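Stripped of the hierarchy machinery Bach alludes to, the core of a sum-of-squares certificate is small enough to check by hand: writing a polynomial as p(x) = z(x)ᵀ Q z(x) for a monomial vector z(x) and a positive semidefinite Gram matrix Q proves p is nonnegative everywhere. The particular polynomial and Gram matrix below are my own toy example, not from Bach's post.

```python
import numpy as np

# Certify that p(x) = x^4 - 2 x^2 + 1 is nonnegative, via the Gram
# representation p(x) = z(x)^T Q z(x) with monomial vector z(x) = (1, x, x^2).
# This Q was chosen by hand; in general one searches for it with an SDP solver.
Q = np.array([[ 1.0, 0.0, -1.0],
              [ 0.0, 0.0,  0.0],
              [-1.0, 0.0,  1.0]])

def p(x):
    return x**4 - 2 * x**2 + 1

def gram_form(x):
    z = np.array([1.0, x, x**2])
    return z @ Q @ z

# PSD check: all eigenvalues of Q are nonnegative, so p is a sum of squares
# (here Q = v v^T with v = (1, 0, -1), i.e. p(x) = (1 - x^2)^2).
assert np.all(np.linalg.eigvalsh(Q) >= -1e-12)
# Identity check on sample points.
xs = np.linspace(-3, 3, 13)
assert np.allclose([p(x) for x in xs], [gram_form(x) for x in xs])
```

In the general polynomial-optimization setting, the search over PSD Gram matrices Q is itself a semidefinite program, which is how nonconvex polynomial problems get relaxed to convex ones.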