Often I need a nonparametric representation of a measure over some non-finite index set: a probability, a mass, or a rate. I would like this representation to be flexible and low-assumption. For functions this is not hard; I can simply use a Gaussian process. What can I use for measures? If I am working directly with random distributions of (e.g. probability) mass, then I might also want extra structure, such as conservation of mass.
Processes that naturally represent mass and measure are a whole field in themselves. A tidy taxonomy is hard to give, but the same ingredients and tools recur; here is a list of pieces that we can plug together to build a random measure.
Completely random measures
See Kingman (1967) for the OG introduction. Foti et al. (2013) summarises:
A completely random measure (CRM) is a distribution over measures on some measurable space \(\left(\Theta, \mathcal{F}_{\Theta}\right)\), such that the masses \(\Gamma\left(A_{1}\right), \Gamma\left(A_{2}\right), \ldots\) assigned to disjoint subsets \(A_{1}, A_{2}, \cdots \in \mathcal{F}_{\Theta}\) by a random measure \(\Gamma\) are independent. The class of completely random measures contains important distributions such as the Beta process, the Gamma process, the Poisson process and the stable subordinator.
AFAICT any subordinator will do, i.e. any a.s. non-decreasing Lévy process; its increments over disjoint intervals are independent and nonnegative, so they define a completely random measure on the half-line.
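To make the independence-over-disjoint-sets property concrete, here is a minimal simulation sketch using the Gamma process, one of the CRMs named above. With base measure \(\alpha\) times Lebesgue measure on \([0,1]\), the mass assigned to a set \(A\) is Gamma-distributed with shape \(\alpha|A|\), independently across disjoint sets. The function name and parameter values are my own illustration, not taken from any of the references.

```python
import numpy as np

rng = np.random.default_rng(0)

def gamma_crm_masses(intervals, alpha, rng):
    """Masses that a Gamma CRM with base measure alpha * Lebesgue assigns
    to disjoint intervals: Gamma(alpha * |A|, 1), independently across A."""
    lengths = np.array([b - a for a, b in intervals])
    return rng.gamma(shape=alpha * lengths, scale=1.0)

# Disjoint subsets of [0, 1]
intervals = [(0.0, 0.25), (0.25, 0.5), (0.5, 1.0)]
alpha = 5.0
masses = gamma_crm_masses(intervals, alpha, rng)

print(masses)                 # independent Gamma draws, one per set
print(masses / masses.sum())  # normalising a partition's masses gives a Dirichlet draw
```

As the last line hints, normalising the masses of a partition recovers a Dirichlet-distributed probability vector, which is the usual route from the Gamma CRM to the Dirichlet process.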
TBC
Random coefficient polynomials
As seen in random spectral measures. TBC
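As a placeholder until then, here is a toy sketch of my own (not taken from a reference) of how a random coefficient polynomial can encode a random measure: draw random coefficients for a trigonometric polynomial and use its squared modulus as a nonnegative, unnormalised spectral density on \([-\pi, \pi]\).

```python
import numpy as np

rng = np.random.default_rng(1)

# Random complex coefficients a_0, ..., a_K for p(theta) = sum_k a_k e^{i k theta}
K = 5
a = rng.normal(size=K + 1) + 1j * rng.normal(size=K + 1)

theta = np.linspace(-np.pi, np.pi, 512)
z = np.exp(1j * theta)
p = np.polynomial.polynomial.polyval(z, a)  # evaluate the polynomial at e^{i theta}

density = np.abs(p) ** 2                       # nonnegative by construction
density /= density.sum() * (theta[1] - theta[0])  # normalise to a probability density
```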
Other
Various nonnegative transforms of Gaussian processes seem popular, e.g. squaring or exponentiating a GP sample path and treating the result as a density or intensity. These always seem too messy to me.
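For completeness, a minimal sketch of the exponentiated variant, in the spirit of log-Gaussian Cox-type constructions: sample a GP path on a grid and exponentiate it to obtain a nonnegative function, which can then be treated as the unnormalised density of a random measure. The kernel, lengthscale and grid below are arbitrary choices of mine.

```python
import numpy as np

rng = np.random.default_rng(2)

# Grid on [0, 1] and a squared-exponential GP covariance
# (lengthscale and variance are arbitrary choices for illustration).
x = np.linspace(0.0, 1.0, 200)
ell, sigma2 = 0.1, 1.0
K = sigma2 * np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell ** 2)
K += 1e-8 * np.eye(len(x))  # jitter for numerical stability

# Draw one GP path and exponentiate it to obtain a nonnegative function.
f = rng.multivariate_normal(np.zeros(len(x)), K)
intensity = np.exp(f)

# Treat it as the unnormalised density of a random measure on [0, 1];
# normalising gives a random probability density.
dx = x[1] - x[0]
density = intensity / (intensity.sum() * dx)
```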
References
Barbour. n.d. “Stein’s Method and Poisson Process Convergence.” Journal of Applied Probability.
Barbour, and Brown. 1992. “Stein’s Method and Point Process Approximation.” Stochastic Processes and Their Applications.
Barndorff-Nielsen, and Schmiegel. 2004. “Lévy-Based Spatial-Temporal Modelling, with Applications to Turbulence.” Russian Mathematical Surveys.
Çinlar. 1979. “On Increasing Continuous Processes.” Stochastic Processes and Their Applications.
Foti, Futoma, Rockmore, et al. 2013. “A Unifying Representation for a Class of Dependent Random Measures.” In Artificial Intelligence and Statistics.
Gil-Leyva, Mena, and Nicoleris. 2020. “Beta-Binomial Stick-Breaking Non-Parametric Prior.” Electronic Journal of Statistics.
Griffiths, and Ghahramani. 2011. “The Indian Buffet Process: An Introduction and Review.” Journal of Machine Learning Research.
Higdon. 2002. “Space and Space-Time Modeling Using Process Convolutions.” In Quantitative Methods for Current Environmental Issues.
Ishwaran, and James. 2001. “Gibbs Sampling Methods for Stick-Breaking Priors.” Journal of the American Statistical Association.
Kingman. 1967. “Completely Random Measures.” Pacific Journal of Mathematics.
Lijoi, and Prünster. 2010. “Models Beyond the Dirichlet Process.” In Bayesian Nonparametrics.
Lin. 2016. “On The Dirichlet Distribution.”
Liou, Su, Chiang, et al. 2011. “Gamma Random Field Simulation by a Covariance Matrix Transformation Method.” Stochastic Environmental Research and Risk Assessment.
Lo, and Weng. 1989. “On a Class of Bayesian Nonparametric Estimates: II. Hazard Rate Estimates.” Annals of the Institute of Statistical Mathematics.
MacEachern. 2016. “Nonparametric Bayesian Methods: A Gentle Introduction and Overview.” Communications for Statistical Applications and Methods.
Nieto-Barajas, Prünster, and Walker. 2004. “Normalized Random Measures Driven by Increasing Additive Processes.” Annals of Statistics.
Paisley, Zaas, Woods, et al. n.d. “A Stick-Breaking Construction of the Beta Process.”
Pandey, and Dukkipati. 2016. “On Collapsed Representation of Hierarchical Completely Random Measures.” In International Conference on Machine Learning.
Rao, and Teh. 2009. “Spatial Normalized Gamma Processes.” In Proceedings of the 22nd International Conference on Neural Information Processing Systems. NIPS’09.
Roychowdhury, and Kulis. 2015. “Gamma Processes, Stick-Breaking, and Variational Inference.” In Artificial Intelligence and Statistics.
Thibaux, and Jordan. 2007. “Hierarchical Beta Processes and the Indian Buffet Process.” In Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics.
Walker, Damien, Laud, et al. 1999. “Bayesian Nonparametric Inference for Random Distributions and Related Functions.” Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Wolpert, and Ickstadt. 1998. “Simulation of Lévy Random Fields.” In Practical Nonparametric and Semiparametric Bayesian Statistics. Lecture Notes in Statistics.
Xuan, Lu, and Zhang. 2020. “A Survey on Bayesian Nonparametric Learning.” ACM Computing Surveys.