Multi-objective optimisation

July 14, 2021 — May 4, 2023

Figure 1

Optimising an objective defined as a weighted sum of multiple objectives with unknown weights can be difficult. This comes up in multi-task learning, for example, or in weighting regularisation penalties in regression, including neural nets.
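To make the weighted-sum setup concrete, here is a minimal sketch (toy objectives and parameter values are my own, not from the post): two competing scalar objectives are collapsed into one loss with a weight `w`, and sweeping `w` traces out solutions that trade one objective against the other.

```python
# Toy multi-objective problem: two competing quadratics over a scalar x.
# (Hypothetical example; the objectives and weights are illustrative only.)
f1 = lambda x: (x - 1.0) ** 2  # pulls x towards +1
f2 = lambda x: (x + 1.0) ** 2  # pulls x towards -1

def minimise_weighted_sum(w, lr=0.1, steps=500):
    """Gradient descent on the scalarised objective w*f1 + (1-w)*f2."""
    x = 0.0
    for _ in range(steps):
        grad = w * 2.0 * (x - 1.0) + (1.0 - w) * 2.0 * (x + 1.0)
        x -= lr * grad
    return x

# Sweeping the weight yields a family of trade-off solutions:
front = [
    (f1(x), f2(x))
    for x in (minimise_weighted_sum(w) for w in (0.0, 0.25, 0.5, 0.75, 1.0))
]
```

The difficulty the post is concerned with is that in practice we do not know which `w` gives the trade-off we actually want, and (as Das and Dennis 1997 discuss) sweeping weights can fail to reach parts of the Pareto front at all.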

HT Cheng Soon Ong for pointing out Jonas Degrave and Ira Korshunova’s illustrated explanation of a tricky thing, Why machine learning algorithms are hard to tune (and the fix). His summary:

Machine learning hyperparameters are hard to tune. One way to think about why it is hard is that we are navigating a Pareto front of multiple objectives. One way to solve that problem is to use Lagrange multipliers, as proposed by Platt and Barr (1987).
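The fix in Platt and Barr (1987) is the basic differential multiplier method: instead of hand-tuning a penalty weight, treat one objective as a constraint and run gradient *descent* on the parameters while running gradient *ascent* on the Lagrange multiplier. A minimal sketch on a toy quadratic problem (the problem instance and step sizes here are my own choices, not from the paper):

```python
def bdmm(f_grad, g, g_grad, x0, lr=0.05, steps=5000):
    """Minimise f(x) subject to g(x) = 0.

    Simultaneous gradient descent on x and gradient ascent on the
    multiplier lam, after Platt and Barr's differential multiplier method.
    """
    x, lam = list(x0), 0.0
    for _ in range(steps):
        gx = g(x)                      # current constraint violation
        fg, gg = f_grad(x), g_grad(x)
        # Descent on x along the gradient of the Lagrangian f + lam*g:
        x = [xi - lr * (fgi + lam * ggi) for xi, fgi, ggi in zip(x, fg, gg)]
        lam += lr * gx                 # ascent on the multiplier
    return x, lam

# Toy problem: minimise x^2 + y^2 subject to x + y = 1 (optimum x = y = 0.5).
x, lam = bdmm(
    f_grad=lambda v: [2.0 * v[0], 2.0 * v[1]],
    g=lambda v: v[0] + v[1] - 1.0,
    g_grad=lambda v: [1.0, 1.0],
    x0=[0.0, 0.0],
)
```

The appeal for hyperparameter tuning is that the multiplier adapts automatically until the constraint is met, rather than being a fixed weight we must guess; the iterates spiral in to the constrained optimum rather than descending monotonically.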

A follow-up post describes how to make machine learning algorithms tunable.

1 References

Das, and Dennis. 1997. “A Closer Look at Drawbacks of Minimizing Weighted Sums of Objectives for Pareto Set Generation in Multicriteria Optimization Problems.” Structural Optimization.
Jakob, and Blume. 2014. “Pareto Optimization or Cascaded Weighted Sum: A Comparison of Concepts.” Algorithms.
Kim, and De Weck. 2005. “Adaptive Weighted-Sum Method for Bi-Objective Optimization: Pareto Front Generation.” Structural and Multidisciplinary Optimization.
Kim, and De Weck. 2006. “Adaptive Weighted Sum Method for Multiobjective Optimization: A New Method for Pareto Front Generation.” Structural and Multidisciplinary Optimization.
Marler, and Arora. 2010. “The Weighted Sum Method for Multi-Objective Optimization: New Insights.” Structural and Multidisciplinary Optimization.
Platt, and Barr. 1987. “Constrained Differential Optimization.” In Proceedings of the 1987 International Conference on Neural Information Processing Systems. NIPS’87.
Ryu, Kim, and Wan. 2009. “Pareto Front Approximation with Adaptive Weighted Sum Method in Multiobjective Simulation Optimization.” In Proceedings of the 2009 Winter Simulation Conference (WSC).