Noise outsourcing

June 3, 2024

approximation
Bayes
density
likelihood free
Monte Carlo
nonparametric
optimization
probabilistic algorithms
probability
sciml
statistics

A useful result from probability theory, used for example in reparameterization tricks and in learning with symmetries.

Bloem-Reddy and Teh (2020):

Noise outsourcing is a standard technical tool from measure theoretic probability, where it is also known by other names such as transfer […]. For any two random variables $X$ and $Y$ taking values in nice spaces (e.g., Borel spaces), noise outsourcing says that there exists a functional representation of samples from the conditional distribution $P_{Y|X}$ in terms of $X$ and independent noise: $Y \overset{\text{a.s.}}{=} f(\eta, X)$. […] the relevant property of $\eta$ is its independence from $X$, and the uniform distribution could be replaced by any other random variable taking values in a Borel space, for example a standard normal on $\mathbb{R}$, and the result would still hold, albeit with a different $f$.
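To make this concrete, here is a minimal numerical sketch (not from the paper) using NumPy/SciPy. I pick a toy joint model where the noise-outsourced $f$ is available in closed form as the conditional inverse CDF, and check that $f(\eta, X)$ with $\eta \sim \mathrm{Unif}[0,1]$ reproduces the conditional sampling:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 100_000

# Toy joint model: X ~ N(0, 1) and Y | X ~ N(X, 1).
x = rng.standard_normal(n)

# Noise-outsourced representation Y = f(eta, X) with eta ~ Unif[0, 1]
# independent of X. Here f is the conditional inverse CDF:
# f(eta, x) = x + Phi^{-1}(eta), with Phi the standard normal CDF.
eta = rng.uniform(size=n)
y_outsourced = x + norm.ppf(eta)

# Sampling from the conditional directly, for comparison.
y_direct = x + rng.standard_normal(n)

# Both constructions should give the same joint law: marginal mean of Y
# near 0, marginal variance near 2 (= Var X + conditional variance).
```

The inverse-CDF choice of $f$ is just one convenient representation; the lemma only guarantees that *some* measurable $f$ exists.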

Basic noise outsourcing can be refined in the presence of conditional independence. Let $S: \mathcal{X} \to \mathcal{S}$ be a statistic such that $Y$ and $X$ are conditionally independent, given $S(X)$: $Y \perp\!\!\!\perp_{S(X)} X$. The following basic result […] says that if there is a statistic $S$ that d-separates $X$ and $Y$, then it is possible to represent $Y$ as a noise-outsourced function of $S$.

Lemma 5. Let $X$ and $Y$ be random variables with joint distribution $P_{X,Y}$. Let $\mathcal{S}$ be a standard Borel space and $S: \mathcal{X} \to \mathcal{S}$ a measurable map. Then $S(X)$ d-separates $X$ and $Y$ if and only if there is a measurable function $f: [0,1] \times \mathcal{S} \to \mathcal{Y}$ such that $(X, Y) \overset{\text{a.s.}}{=} (X, f(\eta, S(X)))$, where $\eta \sim \mathrm{Unif}[0,1]$ and $\eta \perp\!\!\!\perp X$.

In particular, $Y = f(\eta, S(X))$ has distribution $P_{Y|X}$. […] Note that in general, $f$ is measurable but need not be differentiable or otherwise have desirable properties, although for modelling purposes it can be limited to functions belonging to a tractable class (e.g., differentiable, parameterized by a neural network). Note also that the identity map $S(X) = X$ trivially d-separates $X$ and $Y$, so that $Y \overset{\text{a.s.}}{=} f(\eta, X)$, which is standard noise outsourcing (e.g., Austin (2015), Lem. 3.1).
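The refined version is what makes this relevant to learning with symmetries: if $S$ is an invariant statistic that d-separates $X$ and $Y$, then the noise-outsourced $f$ automatically inherits the invariance. A toy sketch (my own illustration, assuming a model where $S(X)$ is the permutation-invariant coordinate sum):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n, d = 50_000, 3

# X is a random vector and Y depends on it only through the
# permutation-invariant statistic S(X) = sum of coordinates:
# Y | X ~ N(S(X), 1), so S(X) d-separates X and Y.
x = rng.standard_normal((n, d))
s = x.sum(axis=1)

# Refined noise outsourcing: Y = f(eta, S(X)), eta ~ Unif[0, 1],
# with f again the conditional inverse CDF.
eta = rng.uniform(size=n)
y = s + norm.ppf(eta)

# Because f sees X only through S(X), permuting (here: reversing)
# the coordinates of X leaves Y unchanged pathwise, not merely
# in distribution.
x_perm = x[:, ::-1]
y_perm = x_perm.sum(axis=1) + norm.ppf(eta)
assert np.allclose(y, y_perm)
```

In a modelling setting one would replace the closed-form $f$ with a parameterized family, e.g. a neural network applied to $(\eta, S(X))$, as the excerpt suggests.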

References

Austin. 2015. “Exchangeable Random Measures.” *Annales de l’Institut Henri Poincaré, Probabilités et Statistiques*.
Bloem-Reddy and Teh. 2020. “Probabilistic Symmetries and Invariant Neural Networks.”
Kallenberg. 2002. *Foundations of Modern Probability*. Probability and Its Applications.
———. 2017. *Random Measures, Theory and Applications*.