Goodhart’s Law



We’ve heard that horn sections are good and that Black music is good, so we’ve put together the ideal band.

A concept that recurs in many places: the replication crisis, gender in sport, benchmarking, evolutionary hyperselection, overfitting in statistics.

Goodhart’s law is an adage named after economist Charles Goodhart, which has been phrased by Marilyn Strathern as “When a measure becomes a target, it ceases to be a good measure.”

Goodhart first advanced the idea in a 1975 article, which later became used popularly to criticize the United Kingdom government of Margaret Thatcher for trying to conduct monetary policy on the basis of targets for broad and narrow money. His original formulation was:

Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.

The verb form is fun, as in “don’t goodhart yourself.” Or possibly one could say “to hyperselect.”

Cf. Campbell’s law.

To my mind Goodhart’s law is essentially a warning not to forget that many learning problems are adversarial, and that we might want targets more robust than a single loss function, such as a game-theoretic equilibrium.
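A toy simulation makes the collapse of the “statistical regularity” concrete. Suppose true quality is latent and we only see a noisy proxy of it. Across the whole population the proxy correlates well with quality, but among candidates selected hard on the proxy, that correlation collapses. This is an illustrative sketch, not from the original article; the Gaussian noise model and the 1% selection cutoff are my assumptions.

```python
import random
import statistics

random.seed(0)

def candidate():
    skill = random.gauss(0, 1)           # latent true quality
    proxy = skill + random.gauss(0, 1)   # noisy measurement of it
    return skill, proxy

pool = [candidate() for _ in range(100_000)]

def corr(pairs):
    """Pearson correlation between the two coordinates of each pair."""
    xs, ys = zip(*pairs)
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    return cov / (len(pairs) * statistics.pstdev(xs) * statistics.pstdev(ys))

# Select the top 1% on the proxy: this is "placing pressure on the
# regularity for control purposes".
top = sorted(pool, key=lambda p: p[1], reverse=True)[:1000]

print(corr(pool))  # roughly 0.7: the proxy is informative in general
print(corr(top))   # much lower among the selected: the regularity collapses
```

The mechanism is just range restriction plus tail divergence: conditioning on an extreme proxy value, most of the remaining variation in the proxy is noise, so the measure stops tracking the thing it was supposed to measure.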

Incoming

Filip Piekniewski on the tendency to select bad target losses for convenience, which he analyses as a flavour of Goodhart’s law. See also Measuring Goodhart’s Law, from OpenAI.
