Various multivariate distributions that are marginally Gamma distributed but have correlations. No unifying insight here; this is a grab-bag right now.
How general can the joint distribution of a Gamma vector be? Much effort has been spent on this question (Barndorff-Nielsen, Maejima, and Sato 2006; Barndorff-Nielsen, Pedersen, and Sato 2001; Buchmann et al. 2015; Mathai and Moschopoulos 1991; Mathai and Moschopoulos 1992; Pérez-Abreu and Stelzer 2014; Semeraro 2008; Singpurwalla and Youngren 1993), and in several more papers cited below. There are lots of slightly different options, is the short version.
So here is the simplest multivariate case:
Bivariate thinned Gamma RVs
There are many possible ways of generating two correlated Gamma variates, but as far as I am concerned the default is the Beta thinning method.
For this and throughout, we use the fact that if $G\sim\operatorname{Gamma}(a+b,\lambda)$ and $B\sim\operatorname{Beta}(a,b)$ are independent, then $BG\sim\operatorname{Gamma}(a,\lambda)$ and $(1-B)G\sim\operatorname{Gamma}(b,\lambda)$, and moreover $BG$ and $(1-B)G$ are independent.

Suppose we have $G\sim\operatorname{Gamma}(a_1,\lambda)$ and $B\sim\operatorname{Beta}(b,a_1-b)$ and $G'\sim\operatorname{Gamma}(a_2-b,\lambda)$, for some $0<b<\min(a_1,a_2)$, with all rvs jointly independent. Now we generate
$$X_1=G,\qquad X_2=BG+G'.$$
Then $X_1\sim\operatorname{Gamma}(a_1,\lambda)$ and $X_2\sim\operatorname{Gamma}(a_2,\lambda)$, and thus
$$\operatorname{Cov}(X_1,X_2)=\operatorname{Cov}(G,BG)=\mathbb{E}[B]\operatorname{Var}(G)=\frac{b}{a_1}\cdot\frac{a_1}{\lambda^2}=\frac{b}{\lambda^2}.$$
Thence
$$\operatorname{Corr}(X_1,X_2)=\frac{b/\lambda^2}{\sqrt{(a_1/\lambda^2)(a_2/\lambda^2)}}=\frac{b}{\sqrt{a_1 a_2}}.$$
We can use this to generate more than two correlated gamma variates by successively thinning them, as long as we take care to remember that in general we can only add independent Gamma variates together to produce a new Gamma variate (unlike, say, Gaussian processes).
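A minimal simulation sketch of the thinning construction (the variable names `a1`, `a2`, `b`, `lam` and the specific parameter values are my own choices; NumPy's `gamma` takes shape and scale, so scale is `1/lam` for rate `lam`): thin $X_1=G\sim\operatorname{Gamma}(a_1,\lambda)$ with $B\sim\operatorname{Beta}(b,a_1-b)$ and top up with an independent $G'\sim\operatorname{Gamma}(a_2-b,\lambda)$, which should give $\operatorname{Corr}(X_1,X_2)=b/\sqrt{a_1 a_2}$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
a1, a2, b, lam = 3.0, 2.0, 1.5, 1.0   # need 0 < b < min(a1, a2)

G = rng.gamma(a1, 1 / lam, n)         # X1 ~ Gamma(a1, lam), rate parameterisation
B = rng.beta(b, a1 - b, n)            # Beta thinning fractions
Gp = rng.gamma(a2 - b, 1 / lam, n)    # independent top-up ~ Gamma(a2 - b, lam)

X1 = G
X2 = B * G + Gp                       # marginally Gamma(a2, lam)

print(X1.mean(), X2.mean())           # should be near a1/lam and a2/lam
print(np.corrcoef(X1, X2)[0, 1], b / np.sqrt(a1 * a2))
```

With these values the empirical correlation should land near $1.5/\sqrt{6}\approx 0.61$.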
Multivariate latent thinned Gamma RVs
A different construction.
Suppose we have $K$ independent latent Gamma variates $G_k\sim\operatorname{Gamma}(a,\lambda)$. Let us also suppose we have a latent mixing matrix $W=[w_{ik}]\in[0,1]^{D\times K}$, whose entries are all non-negative and whose rows sum to $1.$ Now, let us define new random variates
$$X_i=\sum_{k=1}^{K}Y_{ik},\quad\text{where}\quad Y_{ik}=B_{ik}G_k,\quad B_{ik}\sim\operatorname{Beta}\bigl(w_{ik}a,(1-w_{ik})a\bigr),$$
with all the $B_{ik}$ jointly independent and independent of the $G_k$. By construction and the Beta thinning rule, we know that $Y_{ik}\sim\operatorname{Gamma}(w_{ik}a,\lambda)$, that the $Y_{ik}$ are independent across $k$ for fixed $i$, and hence that $X_i\sim\operatorname{Gamma}(a,\lambda)$. What can we say about the correlation between them?
For this, we need the fact that, for two independent thinnings of the same $G_k$ (using $\mathbb{E}[B_{ik}]=w_{ik}$),
$$\operatorname{Cov}(B_{ik}G_k,B_{jk}G_k)=\mathbb{E}[B_{ik}]\,\mathbb{E}[B_{jk}]\operatorname{Var}(G_k)=w_{ik}w_{jk}\frac{a}{\lambda^2},\quad i\neq j.$$
Expanding out the $Y_{ik}$s, the cross terms with differing $k$ vanish by independence.
So, for $i\neq j$,
$$\operatorname{Cov}(X_i,X_j)=\sum_{k=1}^{K}\operatorname{Cov}(Y_{ik},Y_{jk})=\frac{a}{\lambda^2}\sum_{k=1}^{K}w_{ik}w_{jk}$$
and
$$\operatorname{Corr}(X_i,X_j)=\sum_{k=1}^{K}w_{ik}w_{jk}=(WW^{\top})_{ij}.$$
After all that work, we find pretty much what we would have expected for the correlation of rvs constructed by weighting of latent Gaussian factors. Was there a shortcut that could have gotten us there quicker?
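As a sanity check, here is one way to realise this construction in NumPy (the mixing-matrix values and the $\operatorname{Beta}(w_{ik}a,(1-w_{ik})a)$ parameterisation of the thinning variables are my assumptions): each latent $G_k$ is independently Beta-thinned into each output, and the off-diagonal empirical correlations should match $WW^\top$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, K, D = 300_000, 3, 2
a, lam = 4.0, 1.0
# Mixing matrix: non-negative entries, rows summing to 1 (my choice of values).
W = np.array([[0.7, 0.3, 0.0],
              [0.2, 0.3, 0.5]])

G = rng.gamma(a, 1 / lam, (n, K))     # latent Gamma(a, lam) variates
X = np.zeros((n, D))
for i in range(D):
    for k in range(K):
        w = W[i, k]
        if w == 0.0:
            continue                  # a zero weight contributes nothing
        Bik = rng.beta(w * a, (1 - w) * a, n)  # thin G_k down to Gamma(w*a, lam)
        X[:, i] += Bik * G[:, k]

# Off-diagonal correlation should be (W W^T)_{01} = 0.7*0.2 + 0.3*0.3 = 0.23.
print(np.corrcoef(X.T)[0, 1], (W @ W.T)[0, 1])
```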
Kingman constructions
TBD. See Kingman (1967) and Lijoi and Prünster (2010).
Generalized Multivariate Gamma
Das and Dey (2010) discuss a lineage of Generalized Multivariate Gammas that sound interesting. For them, multivariate appears to mean square-matrix-variate, but in the papers they cite (Krishnaiah and Rao 1961; Krishnamoorthy and Parthasarathy 1951) it seems to be vector-valued. What is going on here? Am I being boneheaded? Lukacs and Laha (1964) is also referenced; what does that say?
Dussauchoy Multivariate Gamma
Defined in terms of its characteristic function, to have joint independence properties analogous to those of the multivariate Gaussian (Dussauchoy and Berland 1975).
Gaver Multivariate Gamma
Some kind of negative-binomial mixture? (Gaver 1970)
Maximally general multivariate Gamma
The paper to rule them all is Pérez-Abreu and Stelzer (2014), wherein we get a (maximally?) general $d$-variate Gamma distribution in terms of two functions $\alpha$ and $\lambda$ on the unit sphere $S:=\{x\in\mathbb{R}^d:\|x\|=1\}$ with respect to the norm $\|\cdot\|$. In fact $\alpha$ is a Borel measure, so we more properly say it is a function on the Borel subsets of the unit sphere, and $\lambda$ must be Borel-measurable. Enough of that; I play loose with measure-theoretic niceties from here on.
Now, let’s match the notation of the original paper. We use the convention that the Fourier transform of a measure $\mu$ on $\mathbb{R}^d$ is given in terms of the inner product $\langle\cdot,\cdot\rangle$ by $\hat\mu(y)=\int_{\mathbb{R}^d}e^{i\langle y,x\rangle}\mu(dx)$. The following theorem then characterises multivariate Gamma distributions in terms of these Fourier transforms.
Let $\mu$ be an infinitely divisible probability distribution on $\mathbb{R}^d.$ If there exists a finite measure $\alpha$ on the unit sphere $S$ … and a Borel-measurable function $\lambda:S\to(0,\infty)$ such that the Fourier transform $\hat\mu$ of the [probability] measure $\mu$ is given by
$$\hat\mu(y)=\exp\left(\int_S\int_0^\infty\left(e^{ir\langle y,v\rangle}-1\right)\frac{e^{-\lambda(v)r}}{r}\,dr\,\alpha(dv)\right)$$
for all $y\in\mathbb{R}^d$, then $\mu$ is called a $d$-dimensional Gamma distribution with parameters $\alpha$ and $\lambda$, abbreviated $\Gamma_d(\alpha,\lambda)$ distribution. If $\lambda$ is constant, we call $\mu$ a $\lambda$-homogeneous $\Gamma_d$-distribution.
Intriguingly, the choice of norm doesn’t seem to matter much; changing the norm just perturbs $\alpha$ and $\lambda$ in a simple fashion. I assume that $\|\cdot\|_2$ is easiest in practice; integrals of polynomials over the Euclidean sphere are straightforward, at least.
It is not clear to me which families of $\alpha$s and $\lambda$s are interesting from that description. If we could get that inner integral to look polynomial we might do OK.
We do know which ones are admissible, at least:
Let $\alpha$ be a finite measure on $S$ and $\lambda:S\to(0,\infty)$ a measurable function. Then [the previous construction] defines a Lévy measure $\nu$ and thus there exists a $\Gamma_d(\alpha,\lambda)$ probability distribution if and only if
$$\int_S\ln\left(1+\frac{1}{\lambda(v)}\right)\alpha(dv)<\infty.$$
Moreover, […] holds true. The condition is trivially satisfied if $\lambda$ is bounded away from zero $\alpha$-almost everywhere.
Let us fix $\lambda(v)\equiv\lambda$ and consider that homogeneous case where $\lambda$ is constant on the sphere. Then the term with the inner integral simplifies, via
$$\int_0^\infty\left(e^{irz}-1\right)\frac{e^{-\lambda r}}{r}\,dr=-\ln\left(1-\frac{iz}{\lambda}\right),$$
to a weird complex logarithm on that same sphere:
$$\hat\mu(y)=\exp\left(-\int_S\ln\left(1-\frac{i\langle y,v\rangle}{\lambda}\right)\alpha(dv)\right).$$
This is a weird creature. Note that everything cancels out if $\langle y,v\rangle=0$ for $\alpha$-almost every $v$. How flexible is that? If we fix some $\lambda$, we can get a $d$-point gamma correlation by choosing different measures $\alpha$ on the sphere in $\mathbb{R}^d$.
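A concrete special case I find helpful (my worked example, not taken from the paper): take $\alpha$ discrete, $\alpha=\sum_k c_k\delta_{v_k}$. In the homogeneous case the exponent becomes a finite sum and the characteristic function factorises as $\prod_k(1-i\langle y,v_k\rangle/\lambda)^{-c_k}$, which is exactly the characteristic function of $X=\sum_k v_k G_k$ with independent $G_k\sim\operatorname{Gamma}(c_k,\lambda)$; a discrete $\alpha$ just pushes independent Gamma variates along fixed directions. Checking that numerically:

```python
import numpy as np

rng = np.random.default_rng(2)
n, lam = 200_000, 1.0

# Discrete measure alpha = sum_k c_k * delta_{v_k} on the unit sphere (my example values).
c = np.array([1.0, 2.0, 0.5])
v = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
v /= np.linalg.norm(v, axis=1, keepdims=True)   # project onto the l2 unit sphere

# Sample X = sum_k v_k G_k with independent G_k ~ Gamma(c_k, lam).
G = rng.gamma(c, 1 / lam, (n, 3))
X = G @ v                                       # samples from the Gamma_d(alpha, lam) law

def cf_closed_form(y):
    # prod_k (1 - i<y, v_k>/lam)^(-c_k), i.e. exp of minus the complex-log integral wrt alpha
    return np.prod((1 - 1j * (v @ y) / lam) ** (-c))

y = np.array([0.3, -0.2])
emp = np.exp(1j * (X @ y)).mean()               # empirical characteristic function at y
print(emp, cf_closed_form(y))
```

The empirical and closed-form characteristic functions should agree to Monte Carlo error.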
References
Barndorff-Nielsen, Pedersen, and Sato. 2001.
“Multivariate Subordination, Self-Decomposability and Stability.” Advances in Applied Probability.
Bladt, and Nielsen. 2010.
“Multivariate Matrix-Exponential Distributions.” Stochastic Models.
Collins, Dasgupta, and Schapire. 2001.
“A Generalization of Principal Components Analysis to the Exponential Family.” In Advances in Neural Information Processing Systems.
Foti, Futoma, Rockmore, et al. 2013.
“A Unifying Representation for a Class of Dependent Random Measures.” In Artificial Intelligence and Statistics.
Gaver. 1970.
“Multivariate Gamma Distributions Generated by Mixture.” Sankhyā: The Indian Journal of Statistics, Series A.
Grunwald, Hyndman, and Tedesco. 1996. “A Unified View of Linear AR(1) Models.”
Li, and Tao. 2013.
“Simple Exponential Family PCA.” IEEE Transactions on Neural Networks and Learning Systems.
Kingman. 1967.
“Completely Random Measures.” Pacific Journal of Mathematics.
Krishnamoorthy, and Parthasarathy. 1951.
“A Multivariate Gamma-Type Distribution.” The Annals of Mathematical Statistics.
Lijoi, and Prünster. 2010.
“Models Beyond the Dirichlet Process.” In Bayesian Nonparametrics.
Liou, Su, Chiang, et al. 2011.
“Gamma Random Field Simulation by a Covariance Matrix Transformation Method.” Stochastic Environmental Research and Risk Assessment.
Lukacs, and Laha. 1964. Applications of Characteristic Functions.
Mathai, and Moschopoulos. 1991.
“On a Multivariate Gamma.” Journal of Multivariate Analysis.
Mathai, and Provost. 2005.
“Some Complex Matrix-Variate Statistical Distributions on Rectangular Matrices.” Linear Algebra and Its Applications, Tenth Special Issue (Part 2) on Linear Algebra and Statistics.
Mohamed, Ghahramani, and Heller. 2008.
“Bayesian Exponential Family PCA.” In Advances in Neural Information Processing Systems.
Pérez-Abreu, and Stelzer. 2014.
“Infinitely Divisible Multivariate and Matrix Gamma Distributions.” Journal of Multivariate Analysis.
Sato. 1999. Lévy Processes and Infinitely Divisible Distributions.
Semeraro. 2008.
“A Multivariate Variance Gamma Model for Financial Applications.” International Journal of Theoretical and Applied Finance.
Singpurwalla, and Youngren. 1993.
“Multivariate Distributions Induced by Dynamic Environments.” Scandinavian Journal of Statistics.
Walker. 2021.
“On Infinitely Divisible Multivariate Gamma Distributions.” Communications in Statistics - Theory and Methods.
Wolpert. 2021.
“Lecture Notes on Stationary Gamma Processes.” arXiv:2106.00087 [Math].