# Implicit variational inference

Variational inference without densities

May 10, 2024 — March 30, 2024

Tags: approximation, metrics, optimization, probabilistic algorithms, probability, statistics

Variational inference using generative models whose density cannot be evaluated. See Variational Inference using Implicit Models.

Even though it never evaluates likelihoods, implicit VI still typically uses the KL divergence as its loss function; the trick is to estimate the intractable density ratio between the variational distribution and the target from samples alone.

This is also where the connection to adversarial learning comes in: a discriminator trained to distinguish samples from the two distributions recovers exactly that log density ratio.
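The density-ratio trick can be sketched in a toy example. Everything below is illustrative rather than any particular paper's method: the "implicit" sampler is secretly a shifted Gaussian (so the answer is checkable), and the discriminator is a plain logistic regression whose optimal logit equals the log density ratio for balanced classes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Implicit variational sampler: we can draw z ~ q but (pretend we)
# cannot evaluate q's density. Here q is the pushforward of standard
# noise, z = 2 + eps, i.e. q = N(2, 1), chosen so KL(q||p) = 2 exactly.
def sample_q(n):
    return 2.0 + rng.standard_normal(n)

# Reference distribution p = N(0, 1), which we can also sample.
def sample_p(n):
    return rng.standard_normal(n)

n = 20_000
zq, zp = sample_q(n), sample_p(n)

# Discriminator: logistic regression on (q samples -> 1, p samples -> 0).
# At its optimum, the logit f(z) = w0 + w1*z equals log q(z)/p(z).
X = np.column_stack([np.ones(2 * n), np.concatenate([zq, zp])])
y = np.concatenate([np.ones(n), np.zeros(n)])

w = np.zeros(2)
for _ in range(3000):  # full-batch gradient descent on the logistic loss
    p_hat = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 1.0 * X.T @ (p_hat - y) / len(y)

# KL(q||p) = E_q[log q/p], approximated by the mean logit over q samples.
kl_hat = np.mean(np.column_stack([np.ones(n), zq]) @ w)
print(f"estimated KL(q||p): {kl_hat:.2f}")  # true value is 2.0
```

In a real implicit-VI setup the sampler would be a neural network pushforward with no tractable density, and the discriminator's ratio estimate would be plugged into the ELBO in place of the analytic `log q` term.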


## 1 References

Che, Zhang, Sohl-Dickstein, et al. 2020. “Your GAN Is Secretly an Energy-Based Model and You Should Use Discriminator Driven Latent Sampling.” *arXiv:2003.06060 [cs, stat]*.

Huszár. 2017. “Variational Inference Using Implicit Distributions.”

Karaletsos. 2016. “Adversarial Message Passing For Graphical Models.”

Mohamed, and Rezende. 2015. “Variational Information Maximisation for Intrinsically Motivated Reinforcement Learning.” In *Proceedings of the 28th International Conference on Neural Information Processing Systems - Volume 2*. NIPS’15.

Tiao, Bonilla, and Ramos. 2018. “Cycle-Consistent Adversarial Learning as Approximate Bayesian Inference.”

Tran, Ranganath, and Blei. 2017. “Hierarchical Implicit Models and Likelihood-Free Variational Inference.” In *Advances in Neural Information Processing Systems 30*.

Uppal, Stensbo-Smidt, Boomsma, et al. 2023. “Implicit Variational Inference for High-Dimensional Posteriors.”