Pólya-Gamma augmentation trick

February 20, 2017 — April 1, 2022

classification
probabilistic algorithms
probability
statistics

A weird RV, defined as an infinite sum of scaled Gamma variates, useful in Bayesian binomial regression (and maybe other things?) (Polson, Scott, and Windle 2013). Compare and contrast the optimization-driven approach to a similar problem in Gumbel-max tricks.

See also my former colleague Louis Tiao's A Primer on Pólya-gamma Random Variables - Part II: Bayesian Logistic Regression.

Gregory Gundersen, in Pólya-Gamma Augmentation, explains the problem we are trying to solve.

…in logistic regression, the dependent variables are assumed to be i.i.d. from a Bernoulli distribution with parameter $p$, and therefore the likelihood function is

$$
L(p) = \prod_{n=1}^{N} p^{y_n} (1-p)^{1-y_n} = p^{\sum_n y_n} (1-p)^{N - \sum_n y_n}.
$$

The observations interact with the response through a linear relationship with the log-odds,

$$
\log\left(\frac{p}{1-p}\right) = \beta_0 + x_1 \beta_1 + x_2 \beta_2 + \dots + x_D \beta_D = \beta^\top x.
$$

If we solve for $p$ in the log-odds relation, we get

$$
p = \frac{\exp(\beta^\top x_n)}{1 + \exp(\beta^\top x_n)}
$$

and a likelihood of

$$
L(\beta) \propto \prod_{n=1}^{N} \frac{\left[\exp(\beta^\top x_n)\right]^{y_n}}{1 + \exp(\beta^\top x_n)}.
$$

Due to this functional form, Bayesian inference for logistic regression is intractable.
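The escape hatch, due to Polson, Scott, and Windle (2013), is the integral identity

$$
\frac{(e^{\psi})^{a}}{(1 + e^{\psi})^{b}}
= 2^{-b} e^{\kappa \psi} \int_{0}^{\infty} e^{-\omega \psi^{2}/2}\, p(\omega)\, \mathrm{d}\omega,
\qquad \kappa = a - \tfrac{b}{2},
$$

where $p(\omega)$ is the density of $\omega \sim \operatorname{PG}(b, 0)$. Taking $\psi = \beta^\top x_n$, $a = y_n$ and $b = 1$ for each Bernoulli observation, the likelihood conditional on $\omega_n$ is proportional to a Gaussian kernel in $\beta$, so a Gaussian prior on $\beta$ yields a Gaussian conditional posterior, while the auxiliary variable has conditional $\omega_n \mid \beta \sim \operatorname{PG}(1, \beta^\top x_n)$.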

Using Pólya-Gamma RVs we can thus devise an auxiliary-variable Gibbs sampler, sketched below.
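As a concrete illustration, here is a minimal NumPy sketch of that Gibbs sampler for Bayesian logistic regression with a $\mathcal{N}(0, \sigma^2 I)$ prior on $\beta$. The function names (`sample_pg`, `gibbs_logistic`) are mine, and the Pólya-Gamma draws use a crude truncation of the infinite Gamma-sum representation rather than an exact sampler; in practice you would reach for a dedicated implementation such as the `polyagamma` or `pypolyagamma` packages.

```python
import numpy as np

rng = np.random.default_rng(0)


def sample_pg(b, c, n_terms=200):
    """Approximate draw from PG(b, c) by truncating its infinite-sum
    representation:
        omega = (1 / (2 pi^2)) * sum_k g_k / ((k - 1/2)^2 + c^2 / (4 pi^2)),
    with g_k ~ Gamma(b, 1). Truncation to n_terms is an approximation."""
    k = np.arange(1, n_terms + 1)
    g = rng.gamma(shape=b, scale=1.0, size=n_terms)
    return np.sum(g / ((k - 0.5) ** 2 + c ** 2 / (4.0 * np.pi ** 2))) / (2.0 * np.pi ** 2)


def gibbs_logistic(X, y, n_iter=2000, prior_var=10.0):
    """Pólya-Gamma Gibbs sampler for Bayesian logistic regression.

    Alternates
        omega_n | beta   ~ PG(1, x_n' beta)
        beta | omega, y  ~ N(m, V),
    with V = (X' Omega X + B^{-1})^{-1}, m = V X' kappa, kappa_n = y_n - 1/2,
    under the prior beta ~ N(0, prior_var * I)."""
    n, d = X.shape
    kappa = y - 0.5
    B_inv = np.eye(d) / prior_var
    beta = np.zeros(d)
    draws = np.empty((n_iter, d))
    for t in range(n_iter):
        psi = X @ beta
        # Draw the auxiliary Pólya-Gamma variables, one per observation.
        omega = np.array([sample_pg(1.0, c) for c in psi])
        # Conjugate Gaussian update for beta given omega.
        V = np.linalg.inv(X.T @ (omega[:, None] * X) + B_inv)
        m = V @ (X.T @ kappa)
        beta = rng.multivariate_normal(m, V)
        draws[t] = beta
    return draws


if __name__ == "__main__":
    # Tiny synthetic-data check: recover a known coefficient vector.
    n, d = 200, 3
    X = rng.normal(size=(n, d))
    beta_true = np.array([1.5, -2.0, 0.5])
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))
    draws = gibbs_logistic(X, y, n_iter=1000)
    print("posterior mean:", draws[500:].mean(axis=0))
```

Each sweep costs one $D \times D$ solve plus $N$ Pólya-Gamma draws; the exact PG samplers in the packages above replace the truncated sum, which is the slow and slightly biased part of this sketch.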

References

Polson, Scott, and Windle. 2013. “Bayesian Inference for Logistic Models Using Pólya–Gamma Latent Variables.” Journal of the American Statistical Association.