A particular, analytically tractable way of overparameterising optimisation problems to make them “nice” in the sense of being easy to analyse, or solve, via the tools of convex optimisation. Popular in kernel methods, compressive sensing, matrix factorisation, phase retrieval, sparse coding, and probably other things besides.
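A minimal sketch of the core trick, in the standard sum-of-squares form (generic to this literature, not specific to any one paper): to globally minimise a polynomial $p$, rewrite the minimum as the largest lower bound, then relax nonnegativity to membership in the sum-of-squares cone $\Sigma$:

$$
\min_{x \in \mathbb{R}^n} p(x)
\;=\; \sup \{\lambda \in \mathbb{R} : p - \lambda \geq 0 \text{ on } \mathbb{R}^n\}
\;\geq\; \sup \{\lambda \in \mathbb{R} : p - \lambda \in \Sigma\}.
$$

The right-hand side is a semidefinite programme, because $q \in \Sigma$ if and only if $q(x) = v(x)^\top Q v(x)$ for some positive semidefinite Gram matrix $Q$, where $v(x)$ is a vector of monomials up to half the degree of $q$.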
Francis Bach is interested in a particular specialisation, least-squares relaxation. See Sums-of-squares for dummies: a view from the Fourier domain:
In these last two years, I have been studying intensively sum-of-squares relaxations for optimization, learning a lot from many great research papers [1, 2], review papers, books [4, 5, 6, 7, 8], and even websites.
Much of the literature focuses on polynomials as the de facto starting point. While this leads to deep connections between many fields within mathematics, and many applications in various areas (optimal control, data science, etc.), the need for arguably non-natural hierarchies (at least for beginners) sometimes makes the exposition hard to follow at first, and notations a tiny bit cumbersome.
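To make the Gram-matrix view concrete without any hierarchy machinery, here is a hypothetical numerical sketch (the polynomial and basis are my choices, not from the source): we certify that $p(x) = x^4 - 2x^2 + 1$ is a sum of squares by exhibiting a positive semidefinite Gram matrix $Q$ with $p(x) = v(x)^\top Q v(x)$ for the monomial basis $v(x) = [1, x, x^2]$.

```python
import numpy as np

# Candidate Gram matrix for p(x) = x^4 - 2 x^2 + 1 in the basis
# v(x) = [1, x, x^2]; here Q = c c^T with c = [-1, 0, 1], which
# corresponds to the explicit decomposition p(x) = (x^2 - 1)^2.
Q = np.array([[ 1.0, 0.0, -1.0],
              [ 0.0, 0.0,  0.0],
              [-1.0, 0.0,  1.0]])

# Coefficient matching: expanding v^T Q v gives
#   Q00 + 2 Q01 x + (Q11 + 2 Q02) x^2 + 2 Q12 x^3 + Q22 x^4,
# which must equal the coefficients of p, namely [1, 0, -2, 0, 1].
coeffs = [Q[0, 0], 2 * Q[0, 1], Q[1, 1] + 2 * Q[0, 2], 2 * Q[1, 2], Q[2, 2]]
assert coeffs == [1.0, 0.0, -2.0, 0.0, 1.0]

# PSD check: all eigenvalues of Q nonnegative (up to roundoff),
# so p is a sum of squares and hence globally nonnegative.
eigvals = np.linalg.eigvalsh(Q)
print(eigvals.min() >= -1e-12)  # True
```

In a real SOS solver, $Q$ is not guessed but found (or shown not to exist) by a semidefinite programme over all Gram matrices consistent with the coefficient-matching constraints; the hierarchies the quote alludes to arise when the monomial basis is enlarged degree by degree.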