Measure-valued random variates

Including completely random measures and many generalizations



Often I need a nonparametric representation for a measure over some non-finite index set; the measure might represent a probability, a mass, or a rate. I want this representation to be flexible and low-assumption, in the same spirit as a Gaussian process. For functions this is not hard: I can simply use a Gaussian process. What can I use for measures? If I am working directly with random distributions of (e.g. probability) mass then I might also want conservation of mass, for example.

Processes that naturally represent mass and measure are a whole field in themselves. Giving a taxonomy is not easy, but the same ingredients and tools tend to recur. Here is a list of pieces that we can plug together to create a random measure.

Completely random measures

See Kingman (1967) for the OG introduction. Foti et al. (2013) summarises:

A completely random measure (CRM) is a distribution over measures on some measurable space \(\left(\Theta, \mathcal{F}_{\Theta}\right)\), such that the masses \(\Gamma\left(A_{1}\right), \Gamma\left(A_{2}\right), \ldots\) assigned to disjoint subsets \(A_{1}, A_{2}, \cdots \in \mathcal{F}_{\Theta}\) by a random measure \(\Gamma\) are independent. The class of completely random measures contains important distributions such as the Beta process, the Gamma process, the Poisson process and the stable subordinator.

AFAICT any subordinator will do, i.e. any a.s. non-decreasing Lévy process.
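
To make the "independent masses on disjoint sets" property concrete, here is a minimal numpy sketch of arguably the simplest CRM, a Gamma process on \([0,1]\) with Lebesgue base measure: the mass of an interval of length \(\ell\) is an independent \(\operatorname{Gamma}(\alpha \ell, \theta)\) variate. The function name and parameter choices are mine, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def gamma_crm_masses(edges, alpha=5.0, scale=1.0, rng=rng):
    """Sample a Gamma-process CRM on [0, 1], restricted to the partition
    defined by `edges`: masses of disjoint intervals are independent
    Gamma(alpha * length, scale) variates (complete randomness)."""
    lengths = np.diff(edges)
    return rng.gamma(shape=alpha * lengths, scale=scale)

# The total mass has the same Gamma(alpha, scale) law no matter how finely
# we partition, because Gamma shape parameters add over disjoint sets.
coarse = gamma_crm_masses(np.linspace(0.0, 1.0, 5))
fine = gamma_crm_masses(np.linspace(0.0, 1.0, 101))
print(coarse.sum(), fine.sum())  # two independent draws from Gamma(alpha, scale)
```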

TBC

Dirichlet processes

Random locations plus random weights give us a Dirichlet process: breaking sticks, a.k.a. estimating probability distributions using the Dirichlet process. I should work out how to sample from the posterior of these; presumably the Gibbs sampler from Ishwaran and James (2001) is the main trick.
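
In the meantime, a minimal truncated stick-breaking (Sethuraman) sampler for a draw from the DP prior is easy enough to write down; the Gaussian base measure and the truncation level below are arbitrary choices for illustration, not anything canonical.

```python
import numpy as np

rng = np.random.default_rng(1)

def dp_stick_breaking(alpha=2.0, n_atoms=100, rng=rng):
    """Truncated stick-breaking draw from DP(alpha, H) with H = N(0, 1).

    Returns (atoms, weights): atoms drawn i.i.d. from H, and weights
    w_k = v_k * prod_{j<k} (1 - v_j) with v_k ~ Beta(1, alpha).
    """
    v = rng.beta(1.0, alpha, size=n_atoms)
    stick_remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    weights = v * stick_remaining
    atoms = rng.standard_normal(n_atoms)
    return atoms, weights

atoms, weights = dp_stick_breaking()
print(weights.sum())  # close to 1 if the truncation level is generous
```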

Using Gamma processes

Random coefficient polynomials

As seen in random spectral measures. TBC

For categorical variables

A classic.

Pitman-Yor

Indian Buffet process

Beta process

As seen, apparently, in survival analysis (Hjort 1990; Thibaux and Jordan 2007).

Other

Various transforms of Gaussian processes seem popular, e.g. squared or exponentiated. These always seem too messy to me.
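
For the record, the construction usually looks something like the following: sample a GP path on a grid, exponentiate it to get a nonnegative random intensity, and (optionally) normalize to obtain a random probability density. This is a generic sketch with arbitrary kernel and grid choices, not any particular paper's model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Grid on [0, 1] and a squared-exponential GP covariance (arbitrary choices).
x = np.linspace(0.0, 1.0, 200)
lengthscale = 0.1
cov = np.exp(-0.5 * ((x[:, None] - x[None, :]) / lengthscale) ** 2)

# Draw a GP path and exponentiate it so the result is nonnegative.
f = rng.multivariate_normal(np.zeros_like(x), cov + 1e-8 * np.eye(x.size))
intensity = np.exp(f)

# Normalizing on the grid gives a (discretized) random probability density.
dx = x[1] - x[0]
density = intensity / (intensity.sum() * dx)
print((density * dx).sum())  # = 1 up to floating point error
```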

References

Barbour, A. D. n.d. “Stein’s Method and Poisson Process Convergence.” Journal of Applied Probability 25 (A): 175–84.
Barbour, A. D., and T. C. Brown. 1992. “Stein’s Method and Point Process Approximation.” Stochastic Processes and Their Applications 43 (1): 9–31.
Barndorff-Nielsen, O. E., and J. Schmiegel. 2004. “Lévy-Based Spatial-Temporal Modelling, with Applications to Turbulence.” Russian Mathematical Surveys 59 (1): 65.
Çinlar, E. 1979. “On Increasing Continuous Processes.” Stochastic Processes and Their Applications 9 (2): 147–54.
Foti, Nicholas, Joseph Futoma, Daniel Rockmore, and Sinead Williamson. 2013. “A Unifying Representation for a Class of Dependent Random Measures.” In Artificial Intelligence and Statistics, 20–28.
Gil-Leyva, María F., Ramsés H. Mena, and Theodoros Nicoleris. 2020. “Beta-Binomial Stick-Breaking Non-Parametric Prior.” Electronic Journal of Statistics 14 (1): 1479–1507.
Griffiths, Thomas L., and Zoubin Ghahramani. 2011. “The Indian Buffet Process: An Introduction and Review.” Journal of Machine Learning Research 12 (32): 1185–1224.
Higdon, Dave. 2002. “Space and Space-Time Modeling Using Process Convolutions.” In Quantitative Methods for Current Environmental Issues, edited by Clive W. Anderson, Vic Barnett, Philip C. Chatwin, and Abdel H. El-Shaarawi, 37–56. London: Springer.
Hjort, Nils Lid. 1990. “Nonparametric Bayes Estimators Based on Beta Processes in Models for Life History Data.” The Annals of Statistics 18 (3): 1259–94.
Ishwaran, Hemant, and Lancelot F. James. 2001. “Gibbs Sampling Methods for Stick-Breaking Priors.” Journal of the American Statistical Association 96 (453): 161–73.
James, Lancelot F. 2005. “Bayesian Poisson Process Partition Calculus with an Application to Bayesian Lévy Moving Averages.” Annals of Statistics 33 (4): 1771–99.
Kingman, John. 1967. “Completely Random Measures.” Pacific Journal of Mathematics 21 (1): 59–78.
Kirch, Claudia, Matthew C. Edwards, Alexander Meier, and Renate Meyer. 2019. “Beyond Whittle: Nonparametric Correction of a Parametric Likelihood with a Focus on Bayesian Time Series Analysis.” Bayesian Analysis 14 (4): 1037–73.
Lau, John W., and Edward Cripps. 2022. “Thinned Completely Random Measures with Applications in Competing Risks Models.” Bernoulli 28 (1): 638–62.
Lee, Juho, Xenia Miscouridou, and François Caron. 2019. “A Unified Construction for Series Representations and Finite Approximations of Completely Random Measures.” arXiv:1905.10733 [Cs, Math, Stat], May.
Lijoi, Antonio, Bernardo Nipoti, and Igor Prünster. 2014. “Bayesian Inference with Dependent Normalized Completely Random Measures.” Bernoulli 20 (3): 1260–91.
Lijoi, Antonio, and Igor Prünster. 2010. “Models Beyond the Dirichlet Process.” In Bayesian Nonparametrics, edited by Nils Lid Hjort, Chris Holmes, Peter Muller, and Stephen G. Walker, 80–136. Cambridge: Cambridge University Press.
Lin, Jiayu. 2016. “On The Dirichlet Distribution,” 75.
Liou, Jun-Jih, Yuan-Fong Su, Jie-Lun Chiang, and Ke-Sheng Cheng. 2011. “Gamma Random Field Simulation by a Covariance Matrix Transformation Method.” Stochastic Environmental Research and Risk Assessment 25 (2): 235–51.
Lo, Albert Y., and Chung-Sing Weng. 1989. “On a Class of Bayesian Nonparametric Estimates: II. Hazard Rate Estimates.” Annals of the Institute of Statistical Mathematics 41 (2): 227–45.
MacEachern, Steven N. 2016. “Nonparametric Bayesian Methods: A Gentle Introduction and Overview.” Communications for Statistical Applications and Methods 23 (6): 445–66.
Meier, Alexander. 2018. “A matrix Gamma process and applications to Bayesian analysis of multivariate time series.”
Meier, Alexander, Claudia Kirch, and Renate Meyer. 2020. “Bayesian Nonparametric Analysis of Multivariate Time Series: A Matrix Gamma Process Approach.” Journal of Multivariate Analysis 175 (January): 104560.
Nieto-Barajas, Luis E., Igor Prünster, and Stephen G. Walker. 2004. “Normalized Random Measures Driven by Increasing Additive Processes.” Annals of Statistics 32 (6): 2343–60.
Paisley, John, Aimee Zaas, Christopher W Woods, Geoffrey S Ginsburg, and Lawrence Carin. n.d. “A Stick-Breaking Construction of the Beta Process,” 8.
Pandey, Gaurav, and Ambedkar Dukkipati. 2016. “On Collapsed Representation of Hierarchical Completely Random Measures.” In International Conference on Machine Learning, 1605–13. PMLR.
Ranganath, Rajesh, and David M. Blei. 2018. “Correlated Random Measures.” Journal of the American Statistical Association 113 (521): 417–30.
Rao, Vinayak, and Yee Whye Teh. 2009. “Spatial Normalized Gamma Processes.” In Proceedings of the 22nd International Conference on Neural Information Processing Systems, 1554–62. NIPS’09. Red Hook, NY, USA: Curran Associates Inc.
Renesse, Max-K. von. 2005. “Two Remarks on Completely Random Priors.” Technische Universität Berlin.
Roychowdhury, Anirban, and Brian Kulis. 2015. “Gamma Processes, Stick-Breaking, and Variational Inference.” In Artificial Intelligence and Statistics, 800–808. PMLR.
Thibaux, Romain, and Michael I. Jordan. 2007. “Hierarchical Beta Processes and the Indian Buffet Process.” In Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, 564–71. PMLR.
Walker, Stephen G., Paul Damien, Purushottam W. Laud, and Adrian F. M. Smith. 1999. “Bayesian Nonparametric Inference for Random Distributions and Related Functions.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 61 (3): 485–527.
Wolpert, Robert L., and Katja Ickstadt. 1998. “Poisson/Gamma Random Field Models for Spatial Statistics.” Biometrika 85 (2): 251–67.
Wolpert, Robert L., and Katja Ickstadt. 1998. “Simulation of Lévy Random Fields.” In Practical Nonparametric and Semiparametric Bayesian Statistics, edited by Dipak Dey, Peter Müller, and Debajyoti Sinha, 227–42. Lecture Notes in Statistics. New York, NY: Springer.
Xuan, Junyu, Jie Lu, and Guangquan Zhang. 2020. “A Survey on Bayesian Nonparametric Learning.” ACM Computing Surveys 52 (1): 1–36.
