Lévy Gamma processes

October 14, 2019 — March 3, 2022

Lévy processes
probability
stochastic processes
time series

$\renewcommand{\var}{\operatorname{Var}} \renewcommand{\corr}{\operatorname{Corr}} \renewcommand{\dd}{\mathrm{d}} \renewcommand{\bb}[1]{\mathbb{#1}} \renewcommand{\vv}[1]{\boldsymbol{#1}} \renewcommand{\rv}[1]{\mathsf{#1}} \renewcommand{\vrv}[1]{\vv{\rv{#1}}} \renewcommand{\disteq}{\stackrel{d}{=}} \renewcommand{\gvn}{\mid} \renewcommand{\Ex}{\mathbb{E}} \renewcommand{\Pr}{\mathbb{P}}$

Processes with Gamma marginals. Usually when we discuss Gamma processes we mean Gamma-Lévy processes. Such processes have independent Gamma increments, much like a Wiener process has independent Gaussian increments and a Poisson process has independent Poisson increments. Gamma processes provide the classic subordinator models, i.e. non-decreasing Lévy processes.

There are other processes with Gamma marginals.

OK but if a process’s marginals are “Gamma-distributed”, what does that even mean? First, go and read about Gamma distributions. THEN go and read about Beta and Dirichlet distributions. We need both. And especially the Gamma-Dirichlet algebra.

For more, see the Gamma-Beta notebook.

1 The Lévy-Gamma process

Every infinitely divisible distribution induces an associated Lévy process by a standard procedure. This works for the Gamma distribution too.

Ground zero for treating these processes specifically appears to be Ferguson and Klass (1972), and then the weaponisation of these processes to construct the Dirichlet process prior occurs in Ferguson (1974). Tutorial introductions to Gamma(-Lévy) processes can be found in . Existence proofs etc are deferred to those sources. You could also see Wikipedia, although that article was not particularly helpful for me.

The univariate Lévy-Gamma process $$\{\rv{g}(t;\alpha,\lambda)\}_t$$ is an independent-increment process, with time index $$t$$ and parameterised by $$\alpha, \lambda.$$ We assume it is started at $$\rv{g}(0)=0$$.

The marginal density $$g(x;t,\alpha, \lambda )$$ of the process at time $$t$$ is a Gamma density, specifically, $g(x;t, \alpha, \lambda) =\frac{ \lambda^{\alpha t} } { \Gamma (\alpha t) } x^{\alpha t\,-\,1}e^{-\lambda x}, \quad x\geq 0.$ We can think of the Gamma distribution as the distribution at time 1 of a Gamma process.

That is, $$\rv{g}(t) \sim \operatorname{Gamma}(\alpha t, \lambda)$$, which corresponds to increments per unit time with mean $$\bb E(\rv{g}(1))=\alpha/\lambda$$ and variance $$\var(\rv{g}(1))=\alpha/\lambda^2.$$

Aside: note the useful special case $$\alpha t=1$$: then $$\rv{g}(t;\alpha,\lambda )\sim \operatorname{Exp}(\lambda).$$

The increment distribution leads to a method for simulating a path of a Gamma process at a sequence of increasing times, $$\{t_1, t_2, t_3, \dots, t_L\}.$$ Writing $$t_0=0,$$ the increments $$\rv{g}_i:=\rv{g}(t_{i+1})-\rv{g}(t_{i})\sim \operatorname{Gamma}(\alpha(t_{i+1}-t_{i}), \lambda)$$ are independent variates. Presuming we may simulate from the Gamma distribution, it follows that $\rv{g}(t_i)=\sum_{j < i}\left( \rv{g}(t_{j+1})-\rv{g}(t_{j})\right)=\sum_{j < i} \rv{g}_j.$
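A minimal sketch of this scheme in Python (the function name, parameter values, and grid are my own illustrative choices; note that numpy parameterises the Gamma by shape and *scale*, not rate):

```python
import numpy as np

def gamma_process_path(times, alpha, lam, rng=None):
    """Simulate a Gamma-Levy process, started at g(0)=0, at the given
    increasing times, by summing independent Gamma increments."""
    rng = np.random.default_rng() if rng is None else rng
    times = np.asarray(times, dtype=float)
    # Increment over [t_i, t_{i+1}) ~ Gamma(shape=alpha * dt, rate=lam).
    dt = np.diff(np.concatenate(([0.0], times)))
    increments = rng.gamma(shape=alpha * dt, scale=1.0 / lam)
    return np.cumsum(increments)

path = gamma_process_path(np.linspace(0.1, 1.0, 10), alpha=2.0, lam=3.0)
```

Since every increment is nonnegative, the resulting path is non-decreasing, as a subordinator should be.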

1.1 Lévy characterisation

For arguments $$x, t>0$$ and parameters $$\alpha, \lambda>0,$$ we have the increment density as simply a Gamma density:

$p_{X}(t, x)=\frac{\lambda^{\alpha t} x^{\alpha t-1} \mathrm{e}^{-x \lambda}}{ \Gamma(\alpha t)}.$

This gives us a spectrally positive Lévy measure $\pi_{\rv{x}}(x)=\frac{\alpha}{x} \mathrm{e}^{-\lambda x}$ and Laplace exponent $\Phi_{\rv{x}}(z)=\alpha \ln (1+ z/\lambda), z \geq 0.$

That is, the Poisson rate, with respect to time $$t$$ of jumps whose size is in the range $$[x, x+dx)$$, is $$\pi(x)dx.$$ We think of this as an infinite superposition of Poisson processes driving different sized jumps, where the jumps are mostly tiny. This is how I think about Lévy process theory, at least.
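We can make that superposition picture concrete with a compound-Poisson approximation that keeps only jumps of size at least $$\varepsilon$$ (a sketch of my own, not from the sources above; the function name and the rejection-sampling step are mine). The rate of such jumps is $$\alpha\int_\varepsilon^\infty x^{-1}e^{-\lambda x}\,\dd x = \alpha E_1(\lambda\varepsilon),$$ the exponential integral:

```python
import numpy as np
from scipy.special import exp1  # exponential integral E_1

def gamma_jumps(T, alpha, lam, eps, rng=None):
    """Approximate a Gamma process on [0, T] by a compound Poisson
    process of the jumps of size >= eps, dropping the smaller jumps."""
    rng = np.random.default_rng() if rng is None else rng
    rate = alpha * exp1(lam * eps)      # total rate of jumps >= eps
    n = rng.poisson(rate * T)
    jump_times = rng.uniform(0.0, T, size=n)
    # Rejection-sample jump sizes from the density proportional to
    # exp(-lam*x)/x on [eps, inf): propose x = eps + Exp(lam) and
    # accept with probability eps/x.
    sizes = np.empty(n)
    for i in range(n):
        while True:
            x = eps + rng.exponential(1.0 / lam)
            if rng.uniform() < eps / x:
                sizes[i] = x
                break
    order = np.argsort(jump_times)
    return jump_times[order], sizes[order]
```

The mass discarded in jumps below $$\varepsilon$$ has expectation $$T\alpha\int_0^\varepsilon e^{-\lambda x}\,\dd x,$$ which vanishes as $$\varepsilon \to 0,$$ so the sum of retained jumps converges to the true $$\rv{g}(T)\sim\operatorname{Gamma}(\alpha T, \lambda).$$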

2 Gamma bridge

A useful associated process. Consider a univariate Gamma-Lévy process, $$\rv{g}(t)$$ with $$\rv{g}(0)=0.$$ The Gamma bridge, analogous to the Brownian bridge, is that process conditioned upon attaining a fixed value $$S=\rv{g}(1)$$ at terminal time $$1.$$ We write $$\rv{g}_{S}:=\{\rv{g}(t)\mid \rv{g}(1)=S\}_{0< t < 1}$$ for the paths of this process.

We can simulate from the Gamma bridge easily. Since the increments of the process are independent, if we have a Gamma process $$\rv{g}$$ on the index set $$[0,1]$$ such that $$\rv{g}(1)=S$$, then we can simulate bridge paths connecting these points at an intermediate time $$t,\, 0<t<1,$$ by recalling that the increments have known distributions; in particular $$\rv{g}(t)\sim\operatorname{Gamma}(\alpha t, \lambda)$$ and $$\rv{g}(1)-\rv{g}(t)\sim\operatorname{Gamma}(\alpha (1-t), \lambda)$$, and these, as increments over disjoint intervals, are independent. Then, by Beta thinning, $\frac{\rv{g}(t)}{\rv{g}(1)}\sim\operatorname{Beta}(\alpha t, \alpha(1-t))$ independent of $$\rv{g}(1).$$ We can therefore sample a point of the bridge $$\rv{g}_{S}(t)$$ for some $$t< 1$$ by simulating $$\rv{g}_{S}(t)=B S,$$ where $$B\sim \operatorname{Beta}(\alpha t,\alpha (1-t)).$$

For more on that theme, see Barndorff-Nielsen, Pedersen, and Sato (2001), Émery and Yor (2004) or Yor (2007).

3 Completely random measures

Random probability distributions induced by using Gamma-Lévy processes as a CDF. I laboriously reinvented these, bemused that no one seemed to use them, before discovering that they are called “completely random measures” and they are in fact common.

4 Time-warped Lévy-Gamma process

Çinlar (1980) walks us through the mechanics of (deterministically) time-warping Gamma processes, which ends up being not too unpleasant. Predictable stochastic time-warps look like they should be OK too. See N. Singpurwalla (1997) for an application. Why bother? Linear superpositions of Gamma processes can be hard work, and sometimes the time-warped generalisation comes out nicer, supposedly. 🏗

5 References

Ahrens, and Dieter. 1974. Computing.
———. 1982. Communications of the ACM.
Applebaum. 2004. Notices of the AMS.
———. 2009. Lévy Processes and Stochastic Calculus. Cambridge Studies in Advanced Mathematics 116.
Asmussen, and Glynn. 2007. Stochastic Simulation: Algorithms and Analysis.
Avramidis, L’Ecuyer, and Tremblay. 2003. In Proceedings of the 35th Conference on Winter Simulation: Driving Innovation. WSC ’03.
Barndorff-Nielsen, Maejima, and Sato. 2006. Bernoulli.
Barndorff-Nielsen, Pedersen, and Sato. 2001. Advances in Applied Probability.
Bertoin. 1996. Lévy Processes. Cambridge Tracts in Mathematics 121.
———. 1999. In Lectures on Probability Theory and Statistics: Ecole d’Eté de Probailités de Saint-Flour XXVII - 1997. Lecture Notes in Mathematics.
Bhattacharya, and Waymire. 2009. Stochastic Processes with Applications.
Bondesson. 2012. Generalized Gamma Convolutions and Related Classes of Distributions and Densities. Lecture Notes in Statistics 76.
Buchmann, Kaehler, Maller, et al. 2015. arXiv:1502.03901 [Math, q-Fin].
Chaumont, and Yor. 2012. Exercises in Probability: A Guided Tour from Measure Theory to Random Processes, Via Conditioning.
Çinlar. 1980. Journal of Applied Probability.
Connor, and Mosimann. 1969. “Concepts of Independence for Proportions with a Generalization of the Dirichlet Distribution.” Journal of the American Statistical Association.
Devroye. 1986. Non-uniform random variate generation.
Dufresne. 1998. Advances in Applied Mathematics.
Edwards, Meyer, and Christensen. 2019. Statistics and Computing.
Émery, and Yor. 2004. Publications of the Research Institute for Mathematical Sciences.
Ferguson. 1974. The Annals of Statistics.
Ferguson, and Klass. 1972. The Annals of Mathematical Statistics.
Figueroa-López. 2012. In Handbook of Computational Finance.
Fink. 1997.
Foti, Futoma, Rockmore, et al. 2013. In Artificial Intelligence and Statistics.
Gaver, and Lewis. 1980. Advances in Applied Probability.
Gourieroux, and Jasiak. 2006. Journal of Forecasting.
Griffiths, and Ghahramani. 2011. Journal of Machine Learning Research.
Grigelionis. 2013. Student’s t-Distribution and Related Stochastic Processes. SpringerBriefs in Statistics.
Gupta, and Nadarajah, eds. 2014. Handbook of Beta Distribution and Its Applications.
Gusak, Kukush, Kulik, et al. 2010. Theory of Stochastic Processes : With Applications to Financial Mathematics and Risk Theory. Problem Books in Mathematics.
Hackmann, and Kuznetsov. 2016. The Annals of Applied Probability.
Hjort. 1990. The Annals of Statistics.
Ishwaran, and Zarepour. 2002. Canadian Journal of Statistics.
James, Roynette, and Yor. 2008. Probability Surveys.
Kingman. 1992. Poisson Processes.
Kirch, Edwards, Meier, et al. 2019. Bayesian Analysis.
Kyprianou. 2014. Fluctuations of Lévy Processes with Applications: Introductory Lectures. Universitext.
Lalley. 2007. “Lévy Processes, Stable Processes, and Subordinators.”
Lawrance. 1982. Scandinavian Journal of Statistics.
Lawrence, and Urtasun. 2009. In Proceedings of the 26th Annual International Conference on Machine Learning. ICML ’09.
Lefebvre. 2007. Applied Stochastic Processes. Universitext.
Lin. 2016. “On The Dirichlet Distribution.”
Liou, Su, Chiang, et al. 2011. Stochastic Environmental Research and Risk Assessment.
Lo, and Weng. 1989. Annals of the Institute of Statistical Mathematics.
Mathai. 1982. Annals of the Institute of Statistical Mathematics.
Mathai, and Moschopoulos. 1991. Journal of Multivariate Analysis.
Mathai, and Provost. 2005. Linear Algebra and Its Applications, Tenth Special Issue (Part 2) on Linear Algebra and Statistics,.
Mathai, and Moschopoulos. 1992. Annals of the Institute of Statistical Mathematics.
Meier, Kirch, Edwards, et al. 2019.
Meier, Kirch, and Meyer. 2020. Journal of Multivariate Analysis.
Moschopoulos. 1985. Annals of the Institute of Statistical Mathematics.
Olofsson. 2005. Probability, Statistics, and Stochastic Processes.
Pérez-Abreu, and Stelzer. 2014. Journal of Multivariate Analysis.
Pfaffel. 2012. arXiv:1201.3256 [Math].
Polson, Scott, and Windle. 2013. Journal of the American Statistical Association.
Rao, and Teh. 2009. “Spatial Normalized Gamma Processes.” In Proceedings of the 22nd International Conference on Neural Information Processing Systems. NIPS’09.
Roychowdhury, and Kulis. 2015. In Artificial Intelligence and Statistics.
Rubinstein, and Kroese. 2016. Simulation and the Monte Carlo Method. Wiley series in probability and statistics.
Sato. 1999. Lévy Processes and Infinitely Divisible Distributions.
Semeraro. 2008. International Journal of Theoretical and Applied Finance.
Shah, Wilson, and Ghahramani. 2014. In Artificial Intelligence and Statistics.
Shaked, and Shanthikumar. 1988. Journal of Applied Probability.
Sim. 1990. Journal of Applied Probability.
Singpurwalla, Nozer. 1997. In Engineering Probabilistic Design and Maintenance for Flood Protection.
Singpurwalla, Nozer D., and Youngren. 1993. Scandinavian Journal of Statistics.
Steutel, and van Harn. 2003. Infinite Divisibility of Probability Distributions on the Real Line.
Tankov, and Voltchkova. n.d. “Jump-Diffusion Models: A Practitioner’s Guide.”
Thibaux, and Jordan. 2007. In Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics.
Thorin. 1977a. Scandinavian Actuarial Journal.
———. 1977b. Scandinavian Actuarial Journal.
Tracey, and Wolpert. 2018. 2018 AIAA Non-Deterministic Approaches Conference.
van der Weide. 1997. In Engineering Probabilistic Design and Maintenance for Flood Protection.
Veillette, and Taqqu. 2010a. Statistics & Probability Letters.
———. 2010b. Methodology and Computing in Applied Probability.
Walker. 2000. Scandinavian Journal of Statistics.
Wilson, and Ghahramani. 2011. In Proceedings of the Twenty-Seventh Conference on Uncertainty in Artificial Intelligence. UAI’11.
Wolpert, Robert L. 2021. arXiv:2106.00087 [Math].
Wolpert, Robert L., and Brown. 2021. arXiv:2105.14591 [Math].
Wolpert, R., and Ickstadt. 1998. Biometrika.
Xuan, Lu, Zhang, et al. 2015. arXiv:1503.08542 [Cs, Stat].
Yor. 2007. In Advances in Mathematical Finance. Applied and Numerical Harmonic Analysis.