Fun tricks in non-convex optimisation



In non-convex optimisation, our ultimate destination depends upon the starting point: the same iterative solver, run from different initialisations, can converge to different local minima of the same objective.
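A minimal sketch of that initialisation dependence (the double-well objective, step size, and starting points below are arbitrary illustrative choices, not drawn from the cited papers): plain gradient descent started on opposite sides of the central barrier settles into different local minima.

```python
# Gradient descent on a tilted double well, f(x) = (x^2 - 1)^2 + 0.3 x,
# which has two local minima near x = -1 and x = +1 of different depths.
# The final iterate is decided entirely by which basin we start in.

def f(x):
    return (x**2 - 1) ** 2 + 0.3 * x

def grad_f(x):
    return 4 * x * (x**2 - 1) + 0.3

def gradient_descent(x0, lr=0.05, steps=200):
    x = x0
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

for x0 in (-2.0, 0.5, 2.0):
    x_star = gradient_descent(x0)
    print(f"start {x0:+.1f} -> x* = {x_star:+.4f}, f(x*) = {f(x_star):+.4f}")
```

The run started at −2.0 finds the deeper left basin (f ≈ −0.31), while the runs from 0.5 and 2.0 settle into the shallower right basin (f ≈ +0.29): same objective, same update rule, different answers.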

With symmetries

Zhang, Qu, and Wright (2022) survey nonconvex problems (phase retrieval, dictionary learning, blind deconvolution, …) whose objectives inherit symmetries from the problem structure, and show how those symmetries shape landscapes that are nonetheless benign enough for local methods to succeed.
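A concrete instance of such a symmetry is real-valued phase retrieval: the least-squares loss on phaseless measurements is invariant under the sign flip x ↦ −x, so global minimisers come in ± pairs and the objective cannot be convex. Here is a hedged numerical sketch (Gaussian measurements, plain gradient descent; all dimensions, step sizes, and names are illustrative choices of mine, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 10, 200                       # signal dimension, number of measurements
x_true = rng.normal(size=n)
x_true /= np.linalg.norm(x_true)     # unit-norm planted signal
A = rng.normal(size=(m, n))          # Gaussian measurement vectors a_i as rows
y = (A @ x_true) ** 2                # phaseless measurements y_i = <a_i, x>^2

def loss(x):
    # f(x) = (1/4m) * sum_i (<a_i, x>^2 - y_i)^2 ; note that f(x) == f(-x)
    return np.mean(((A @ x) ** 2 - y) ** 2) / 4

def grad(x):
    z = A @ x
    return A.T @ ((z ** 2 - y) * z) / m

x = rng.normal(size=n)
x /= np.linalg.norm(x)               # random unit-norm initialisation
for _ in range(2000):
    x -= 0.1 * grad(x)

# Recovery is only defined up to the global sign symmetry x -> -x.
err = min(np.linalg.norm(x - x_true), np.linalg.norm(x + x_true))
print(f"f(x) = {loss(x):.2e}  f(-x) = {loss(-x):.2e}  error up to sign = {err:.2e}")
```

In this heavily over-determined regime (m = 20n) gradient descent from a random start typically reaches one of the two symmetric global minimisers, which is why the error is only meaningful up to sign.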

In phase retrieval

See phase retrieval.

References

Choromanska, Anna, Mikael Henaff, Michael Mathieu, Gerard Ben Arous, and Yann LeCun. 2015. “The Loss Surfaces of Multilayer Networks.” In Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, 192–204.
Jain, Prateek, and Purushottam Kar. 2017. Non-Convex Optimization for Machine Learning. Foundations and Trends in Machine Learning, Vol. 10.
Soltanolkotabi, M., A. Javanmard, and J. D. Lee. 2019. “Theoretical Insights Into the Optimization Landscape of Over-Parameterized Shallow Neural Networks.” IEEE Transactions on Information Theory 65 (2): 742–69.
Wright, John, and Yi Ma. 2022. High-Dimensional Data Analysis with Low-Dimensional Models: Principles, Computation, and Applications. Cambridge University Press.
Zhang, Yuqian, Qing Qu, and John Wright. 2022. “From Symmetry to Geometry: Tractable Nonconvex Problems.” arXiv.
