likelihood_free on Dan MacKinlay
https://danmackinlay.name/tags/likelihood_free.html
Recent content in likelihood_free on Dan MacKinlay. Last updated Mon, 08 Mar 2021 18:07:53 +1100.

Reparameterization tricks in inference
https://danmackinlay.name/notebook/reparameterization_trick.html
Mon, 08 Mar 2021 18:07:53 +1100
Approximating the desired distribution by perturbation of the available distribution.
A trick in e.g. variational inference, especially autoencoders, for density estimation in probabilistic deep learning, best summarised as “a fancy change of variables so that I can differentiate through the parameters of a distribution”. Connections to optimal transport and likelihood-free inference, in that this trick can enable some clever approximate-likelihood approaches.
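A minimal numpy sketch of the change of variables, under illustrative assumptions of mine (a Gaussian with parameters mu and sigma, and a toy objective E[z^2]); not code from the post:

```python
import numpy as np

rng = np.random.default_rng(0)

# Reparameterization: instead of sampling z ~ N(mu, sigma^2) directly,
# write z = mu + sigma * eps with eps ~ N(0, 1). The randomness now lives
# in eps, so z is a deterministic, differentiable function of (mu, sigma).
mu, sigma = 1.0, 0.5
eps = rng.standard_normal(100_000)
z = mu + sigma * eps

# Monte Carlo gradient of E[z^2] with respect to mu:
# d/dmu (mu + sigma*eps)^2 = 2*(mu + sigma*eps), averaged over eps.
grad_mu = np.mean(2.0 * z)

# Analytically E[z^2] = mu^2 + sigma^2, so the true gradient is 2*mu = 2.0;
# the Monte Carlo estimate should land close to that.
```

The same pattern is what lets autodiff frameworks push gradients through the sampling step of a variational autoencoder's latent distribution.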

Generative adversarial learning
https://danmackinlay.name/notebook/adversarial_learning_generative.html
Mon, 14 Dec 2020 16:32:29 +1100
The critic providing a gradient update to the generator.
Game theory meets learning. Hip, especially in combination with deep learning, because it provides an elegant means of likelihood-free inference.
I don’t know anything about it. Something about training two systems together to both generate and classify examples of a phenomenon of interest.
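The two-systems idea can be sketched as a 1-D toy game: a minimal, dependency-free illustration, not a practical GAN, and the players, shapes, and learning rates are all my assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def num_grad(f, p, h=1e-4):
    """Central finite-difference gradient (keeps the sketch dependency-free)."""
    g = np.zeros_like(p)
    for i in range(p.size):
        e = np.zeros_like(p)
        e[i] = h
        g[i] = (f(p + e) - f(p - e)) / (2 * h)
    return g

real = rng.normal(2.0, 1.0, size=512)   # "data" distribution: N(2, 1)
d_params = np.array([0.1, 0.0])         # discriminator D(x) = sigmoid(a*x + b)
g_params = np.array([0.0, 1.0])         # generator G(eps) = mu + s*eps

for _ in range(500):
    eps = rng.standard_normal(512)      # shared noise draw for this round

    def d_loss(dp):
        # Classifier player: score real samples high, generated samples low.
        fake = g_params[0] + g_params[1] * eps
        return -(np.mean(np.log(sigmoid(dp[0] * real + dp[1]) + 1e-9))
                 + np.mean(np.log(1 - sigmoid(dp[0] * fake + dp[1]) + 1e-9)))

    def g_loss(gp):
        # Generator player: move samples toward where the critic calls them real.
        fake = gp[0] + gp[1] * eps
        return -np.mean(np.log(sigmoid(d_params[0] * fake + d_params[1]) + 1e-9))

    d_params = d_params - 0.2 * num_grad(d_loss, d_params)
    g_params = g_params - 0.2 * num_grad(g_loss, g_params)

# After alternating updates, the generator's mean should have drifted
# toward the data mean, pushed by the critic's gradient.
```

Note that it is the critic's gradient that updates the generator, which is the likelihood-free part: no density of the data is ever evaluated.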
Sanjeev Arora gives a cogent intro. He also suggests a link with learning theory.

Simulation based inference
https://danmackinlay.name/notebook/simulation_based_inference.html
Tue, 25 Aug 2020 10:18:37 +1000
Simulation-based inference, likelihood-free inference, and approximate Bayesian computation are all terrible descriptions. There are many ways that inference can be based upon simulations, many types of freedom from likelihood, and many ways to approximate Bayesian computation. However, all these terms together refer to a particular thing.
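One concrete member of that family is the plain ABC rejection sampler: simulate from the model, keep parameters whose simulated summaries land near the observed ones, never touch a likelihood. A toy sketch, with a Gaussian simulator, a uniform prior, and a mean summary statistic all chosen by me for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# "Observed" data from an unknown location parameter (truth: 3.0).
observed = rng.normal(3.0, 1.0, size=200)
obs_stat = observed.mean()          # one summary statistic

accepted = []
for _ in range(20_000):
    theta = rng.uniform(-10, 10)                # draw from the prior
    sim = rng.normal(theta, 1.0, size=200)      # simulate a dataset
    if abs(sim.mean() - obs_stat) < 0.05:       # compare summaries
        accepted.append(theta)

# The accepted thetas approximate the posterior; their mean should sit
# near the true location of 3.0.
posterior_mean = np.mean(accepted)
```

The inference is "based on simulation" in the literal sense: the model only ever appears as a sampler.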
TBD: relationship between this and indirect inference. They look similar but tend not to cite each other. Is this a technical or sociological difference?

Generative neural net models
https://danmackinlay.name/notebook/nn_generative.html
Sun, 14 Jun 2020 19:04:57 +1000
Observations arising from unobserved latent factors.
Certain famous models in neural nets are generative — informally, they produce samples from some distribution, and the distribution of those samples is tweaked until it resembles, say, the distribution of our observed data.
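The "observations arising from unobserved latent factors" structure can be sketched with a linear toy generator; in a real generative neural net the fixed matrix below would be a trained network, but the names and dimensions here are my own illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Fixed linear "decoder": 2 unobserved latent factors -> 3 observed dims.
W = np.array([[1.0, 0.0],
              [0.5, 2.0],
              [-1.0, 1.0]])

def generate(n):
    z = rng.standard_normal((n, 2))              # unobserved latent factors
    x = z @ W.T + 0.1 * rng.standard_normal((n, 3))  # observations + noise
    return x

samples = generate(10_000)
# The sample distribution is fully determined by W; training a generative
# model means tweaking W (or a network in its place) until this
# distribution resembles the data distribution.
```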
Tangent: learning problems involve composing differentiation and integration of various terms that measure how well you have approximated the state of the world.

Probabilistic neural nets
https://danmackinlay.name/notebook/nn_probabilistic.html
Sun, 14 Jun 2020 19:04:57 +1000
Inferring densities and distributions in a massively parameterised deep learning setting.
This is not intrinsically a Bayesian thing to do, but in practice much of the demand comes from Bayesian posterior inference for neural nets, and accordingly most of the action is over there.

Likelihood free inference
https://danmackinlay.name/notebook/likelihood_free_inference.html
Wed, 22 Apr 2020 17:36:41 +1000
Finding the target without directly inspecting the likelihood of the current guess.
A terrible term which seems to have a couple of distinct uses; I do not yet understand which of them coincide.
I mean this in the sense of trying to approximate intractable likelihoods; there seems also to be a school which would like to use this term for methods that make no reference to probability densities whatsoever.

Indirect inference
https://danmackinlay.name/notebook/indirect_inference.html
Tue, 15 Dec 2015 14:12:55 +0800
A.k.a. the auxiliary method. AFAICT the same thing as simplified synthetic likelihoods. Maybe the same thing as simulation-based inference.
Here be economists and ecologists.
Maybe this will solve my current weird intractable model issues?
There is an R package for at least some versions of it: pomp.
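Since the entry equates this with simplified synthetic likelihoods, here is a toy synthetic-likelihood evaluation in numpy. The simulator, summary statistics, and test points are my illustrative assumptions, and this is not the pomp interface:

```python
import numpy as np

rng = np.random.default_rng(3)

# "Observed" data from an intractable-pretend model (truth: mu=1.5, sigma=2).
observed = rng.normal(1.5, 2.0, size=300)
obs_summ = np.array([observed.mean(), observed.std()])

def synthetic_loglik(theta, n_sims=200):
    """Fit a Gaussian to simulated summary statistics and score the
    observed summaries under it; no tractable likelihood is needed."""
    mu, sigma = theta
    summs = np.array([
        [s.mean(), s.std()]
        for s in (rng.normal(mu, sigma, size=300) for _ in range(n_sims))
    ])
    m = summs.mean(axis=0)
    C = np.cov(summs.T) + 1e-9 * np.eye(2)      # jitter for stability
    diff = obs_summ - m
    return -0.5 * diff @ np.linalg.solve(C, diff) - 0.5 * np.log(np.linalg.det(C))

# The synthetic log-likelihood should prefer parameters near the truth.
good = synthetic_loglik((1.5, 2.0))
bad = synthetic_loglik((5.0, 0.5))
```

The auxiliary-method flavour of indirect inference replaces the hand-picked summaries above with the parameter estimates of a simpler, tractable auxiliary model; the simulate-then-match structure is the same.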
Quoting Cosma:
[…] your model is too complicated for you to appeal to any of the usual estimation methods of statistics.