# Measure-valued random variates

Including completely random measures and many generalizations

October 16, 2020 — March 30, 2022

functional analysis
kernel tricks
machine learning
nonparametric
PDEs
physics
point processes
probability
regression
SDEs
spatial
statistics
stochastic processes
uncertainty

Often I need a nonparametric representation of a measure over some non-finite index set: a probability, a mass, or a rate. I might want this representation to be something flexible and low-assumption, like a Gaussian process. If I want a nonparametric representation of functions this is not hard, since I can simply use a Gaussian process. What can I use for measures? If I am working directly with random distributions of (e.g. probability) mass then I might also want constraints such as conservation of mass.

Processes that naturally represent mass and measure are a whole field in themselves. Giving a taxonomy is not easy, but the same ingredients and tools tend to recur. Here is a list of pieces that we can plug together to create a random measure.

## 1 Completely random measures

See Kingman (1967) for the OG introduction. Foti et al. (2013) summarises:

A completely random measure (CRM) is a distribution over measures on some measurable space $$\left(\Theta, \mathcal{F}_{\Theta}\right)$$, such that the masses $$\Gamma\left(A_{1}\right), \Gamma\left(A_{2}\right), \ldots$$ assigned to disjoint subsets $$A_{1}, A_{2}, \cdots \in \mathcal{F}_{\Theta}$$ by a random measure $$\Gamma$$ are independent. The class of completely random measures contains important distributions such as the Beta process, the Gamma process, the Poisson process and the stable subordinator.

AFAICT any subordinator will do, i.e. any a.s. non-decreasing Lévy process.
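As a concrete instance, the Gamma process is a CRM: with base measure $\alpha\,\mathrm{d}\theta$ it assigns independent $\mathrm{Gamma}(\alpha|A_i|, 1)$ masses to disjoint intervals $A_i$. A minimal simulation sketch on a fixed partition of the unit interval (the grid and concentration value here are arbitrary illustrative choices of mine):

```python
import numpy as np

rng = np.random.default_rng(42)

# Gamma process on [0, 1] with base measure alpha * Lebesgue:
# disjoint intervals receive independent Gamma(alpha * length, 1) masses,
# which is exactly the "completely random" property.
alpha = 5.0
edges = np.linspace(0.0, 1.0, 11)          # 10 disjoint intervals
lengths = np.diff(edges)
masses = rng.gamma(shape=alpha * lengths, scale=1.0)

# By additivity of independent Gammas, the total mass is Gamma(alpha, 1).
total = masses.sum()
```

Normalising `masses / total` gives a draw from a (coarsened) Dirichlet distribution, which is one route to the Dirichlet process below.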

TBC

## 2 Dirichlet processes

Random locations plus random weights give us a Dirichlet process: breaking sticks to estimate probability distributions. I should work out how to sample from the posterior of these; presumably the Gibbs sampler from Ishwaran and James (2001) is the main trick.
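The stick-breaking construction itself can be sketched in a few lines; the truncation level and the standard-normal base measure here are illustrative choices of mine, not canonical:

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking(concentration, n_atoms, rng):
    """Truncated stick-breaking draw of Dirichlet process weights:
    break Beta(1, concentration) fractions off the remaining stick."""
    betas = rng.beta(1.0, concentration, size=n_atoms)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

weights = stick_breaking(concentration=2.0, n_atoms=50, rng=rng)
atoms = rng.normal(size=50)  # i.i.d. draws from the base measure

# The (truncated) random probability measure is
#   sum_k weights[k] * delta_{atoms[k]},
# with the leftover mass 1 - weights.sum() lost to truncation.
```

Larger `concentration` breaks smaller pieces per step, so mass spreads over more atoms; small `concentration` concentrates mass on the first few.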

## 4 Random coefficient polynomials

As seen in random spectral measures. TBC
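One toy version of this idea: a trigonometric polynomial with independent Gaussian coefficients realises a stationary random function whose spectral measure puts mass on the chosen frequencies. The frequencies and unit coefficient variances below are arbitrary choices of mine for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random trigonometric polynomial: each frequency gets an independent
# Gaussian cosine/sine coefficient pair, so the realised process is a
# stationary Gaussian process with a discrete spectral measure.
freqs = np.arange(1, 6)
a = rng.normal(size=5)
b = rng.normal(size=5)

t = np.linspace(0, 2 * np.pi, 200)
f = (a[:, None] * np.cos(freqs[:, None] * t)
     + b[:, None] * np.sin(freqs[:, None] * t)).sum(axis=0)
```

Randomising the frequencies as well (rather than fixing a grid) is the random-Fourier-features move, which approximates a continuous spectral measure.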

## 8 Beta process

As seen, apparently, in survival analysis (Hjort 1990).

## 9 Other

Various transforms of Gaussian processes seem popular, e.g. squared or exponentiated. These always seem too messy to me.
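For concreteness, here is the exponentiation version on a finite grid: draw a GP, exponentiate to make it non-negative, and normalise so it integrates to one, a crude discretised cousin of the logistic-Gaussian-process density prior. The squared-exponential kernel and its lengthscale are placeholder choices:

```python
import numpy as np

rng = np.random.default_rng(7)

# GP on a grid with a squared-exponential kernel (lengthscale 0.1),
# plus a small jitter on the diagonal for numerical stability.
x = np.linspace(0.0, 1.0, 100)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.1 ** 2)
g = rng.multivariate_normal(np.zeros(100), K + 1e-8 * np.eye(100))

# Exponentiate for positivity, then normalise on the grid so the
# result behaves like a random probability density.
density = np.exp(g)
density /= density.sum() * (x[1] - x[0])
```

The messiness I mean: the normalising constant couples every grid point, so the pleasant marginal structure of the GP is lost and inference needs bespoke machinery (e.g. the spectral approaches of Kirch et al. 2019).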

## 12 References

Barbour, A. D. n.d. Journal of Applied Probability.
Barbour, A. D., and Brown. 1992. Stochastic Processes and Their Applications.
Barndorff-Nielsen, and Schmiegel. 2004. Russian Mathematical Surveys.
Çinlar. 1979. Stochastic Processes and Their Applications.
Foti, Futoma, Rockmore, et al. 2013. In Artificial Intelligence and Statistics.
Gil–Leyva, Mena, and Nicoleris. 2020. Electronic Journal of Statistics.
Griffiths, and Ghahramani. 2011. Journal of Machine Learning Research.
Higdon. 2002. In Quantitative Methods for Current Environmental Issues.
Hjort. 1990. The Annals of Statistics.
Ishwaran, and James. 2001. Journal of the American Statistical Association.
James. 2005. Annals of Statistics.
Kingman. 1967. Pacific Journal of Mathematics.
Kirch, Edwards, Meier, et al. 2019. Bayesian Analysis.
Lau, and Cripps. 2022. Bernoulli.
Lee, Miscouridou, and Caron. 2019. arXiv:1905.10733 [Cs, Math, Stat].
Lijoi, Nipoti, and Prünster. 2014. Bernoulli.
Lijoi, and Prünster. 2010. In Bayesian Nonparametrics.
Lin. 2016. “On The Dirichlet Distribution.”
Liou, Su, Chiang, et al. 2011. Stochastic Environmental Research and Risk Assessment.
Lo, and Weng. 1989. Annals of the Institute of Statistical Mathematics.
MacEachern. 2016. Communications for Statistical Applications and Methods.
Meier, Kirch, and Meyer. 2020. Journal of Multivariate Analysis.
Nieto-Barajas, Prünster, and Walker. 2004. Annals of Statistics.
Paisley, Zaas, Woods, et al. n.d. “A Stick-Breaking Construction of the Beta Process.”
Pandey, and Dukkipati. 2016. In International Conference on Machine Learning.
Ranganath, and Blei. 2018. Journal of the American Statistical Association.
Rao, and Teh. 2009. “Spatial Normalized Gamma Processes.” In Proceedings of the 22nd International Conference on Neural Information Processing Systems. NIPS’09.
Roychowdhury, and Kulis. 2015. In Artificial Intelligence and Statistics.
Thibaux, and Jordan. 2007. In Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics.
von Renesse. 2005.
Walker, Damien, Laud, et al. 1999. Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Wolpert, Robert L., and Ickstadt. 1998. In Practical Nonparametric and Semiparametric Bayesian Statistics. Lecture Notes in Statistics.
Wolpert, R., and Ickstadt. 1998. Biometrika.
Xuan, Lu, and Zhang. 2020. ACM Computing Surveys.