Placeholder. Various links on inference by minimising divergences other than the Kullback–Leibler divergence.

As mentioned under likelihood-free inference, this is especially interesting in the case of Bayesian inference or, more generally, distributional inference, where complications ensue.

(Chu, Blanchet, and Glynn 2019):

> in many fields, the object of interest is a probability distribution; moreover, the learning process is guided by a probability functional to be minimized, a loss function that conceptually maps a probability distribution to a real number […] Because the optimization now takes place in the infinite-dimensional space of probability measures, standard finite-dimensional algorithms like gradient descent are initially unavailable; even the proper notion for the derivative of these functionals is unclear. We call upon a body of literature known as von Mises calculus, originally developed in the field of asymptotic statistics, to make these functional derivatives precise. Remarkably, we find that once the connection is made, the resulting generalized descent algorithm, which we call probability functional descent, is intimately compatible with standard deep learning techniques such as stochastic gradient descent, the reparameterization trick, and adversarial training.
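The ingredient that makes such functional descent practical is the reparameterization trick: writing samples as a deterministic function of the parameters and exogenous noise so that gradients of an expectation pass through the samples. A minimal NumPy sketch (not Chu et al.'s algorithm; the objective `f`, the Gaussian family, and all hyperparameters here are illustrative choices) descends on the functional J(p) = E_p[f(x)] over Gaussians N(μ, σ²):

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Per-sample loss; the functional being minimized is J(p) = E_p[f(x)].
    return (x - 2.0) ** 2

def df(x):
    # Derivative of f, used for pathwise (reparameterized) gradients.
    return 2.0 * (x - 2.0)

mu, log_sigma = 0.0, 0.0   # parameters of the candidate distribution N(mu, sigma^2)
lr, n_samples = 0.05, 256

for step in range(500):
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal(n_samples)
    x = mu + sigma * eps       # reparameterization trick: x is a function of (mu, sigma, noise)
    # Monte Carlo estimates of the pathwise gradients of E[f(x)]
    g_mu = df(x).mean()
    g_log_sigma = (df(x) * eps).mean() * sigma
    mu -= lr * g_mu
    log_sigma -= lr * g_log_sigma

# The minimizer of E[(x-2)^2] concentrates at x = 2, so mu -> 2 and sigma -> 0.
print(mu, np.exp(log_sigma))
```

The same template extends to losses that are genuinely functionals of the distribution (e.g. divergences estimated adversarially), which is where the von Mises machinery earns its keep.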

## Generalized Bayesian computation

## References

*Proceedings of the 32nd International Conference on Neural Information Processing Systems*, 2478–87. NIPS'18. USA: Curran Associates Inc.

*International Conference on Machine Learning*, 214–23.

*The Annals of Statistics* 5 (3): 445–63.

*Journal of the Royal Statistical Society: Series B (Statistical Methodology)* 78 (5): 1103–30.

*arXiv:1610.05627 [Math, Stat]*, October.

*arXiv:1705.07152 [Stat]*, May.

*arXiv:1810.02403 [Math]*, October.

*arXiv:2004.07052 [Physics, q-Bio, Stat]*, April.

*arXiv:1710.05053 [Cs, Stat]*, October.

*arXiv:1902.00640 [Cs, Stat]*, February.

*ICML*.

*arXiv:2202.04744 [Cs, Stat]*, February.

*von Mises calculus for statistical functionals*. Lecture Notes in Statistics 19. New York: Springer.

*Wiley StatsRef: Statistics Reference Online*. John Wiley & Sons, Ltd.

*arXiv:1902.03175 [Cs, Stat]*, August.

*Advances in Neural Information Processing Systems 28*, edited by C. Cortes, N. D. Lawrence, D. D. Lee, M. Sugiyama, and R. Garnett, 2053–61. Curran Associates, Inc.

*International Statistical Review* 70 (3): 419–35.

*arXiv:1704.00028 [Cs, Stat]*, March.

*arXiv:1705.07164 [Cs, Stat]*, May.

*International Conference on Machine Learning*, 3159β68.

*Proceedings of The 33rd International Conference on Machine Learning*, 9.

*Proceedings of the 32nd International Conference on Neural Information Processing Systems*, 2075–85. NIPS'18. Red Hook, NY, USA: Curran Associates Inc.

*arXiv:1906.03317 [Cs, Math, Stat]*, June.

*arXiv:2104.07359 [Math, Stat]*, April.

*arXiv:2008.09165 [Cs, Math, Stat]*, May.

*arXiv:2104.03889 [Stat]*, March.

*Annual Review of Statistics and Its Application* 6 (1): 405–31.

*Advances in Neural Information Processing Systems 29*, edited by D. D. Lee, M. Sugiyama, U. V. Luxburg, I. Guyon, and R. Garnett, 496–504. Curran Associates, Inc.

*Stat* 10 (1): e329.

*Optimal Transport for Applied Mathematicians*. By Filippo Santambrogio. Progress in Nonlinear Differential Equations and Their Applications. Cham: Springer International Publishing.

*arXiv:2011.08644 [Stat]*, February.

*ACM Transactions on Graphics* 34 (4): 66:1–11.

*Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)*, 284–94. Minneapolis, Minnesota: Association for Computational Linguistics.

*Proceedings of NeurIPS 2020*.
