Markov bridge processes

Especially Lévy bridges


A bridge process for a time-indexed Markov process $$\{\rv{x}(t)\}_{t\in[0,T]}$$ is obtained from that process by conditioning it to start from $$\rv{x}(0)=X$$ and attain a fixed value $$\rv{x}(T)=Y$$ at the final time of an interval $$[0,T]$$. We write this as $$\{\rv{x}(t)\mid \rv{x}(0)=X,\rv{x}(T)=Y\}_{t\in[0,T]}.$$ Put another way: given the starting and finishing values of a Markov process, I would like to rewind time and find the value of its path at a midpoint which is “compatible” with the endpoints.1 Or, by jumping back then forward again, we can construct generalised autoregressive processes.
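Concretely, assuming the underlying process has transition densities $$p_s(x,y)$$, the single-time marginal of the bridge follows from the Markov property and Bayes’ rule:

$$p\bigl(\rv{x}(t)\in \mathrm{d}x \mid \rv{x}(0)=X, \rv{x}(T)=Y\bigr) = \frac{p_t(X,x)\,p_{T-t}(x,Y)}{p_T(X,Y)}\,\mathrm{d}x,\qquad 0<t<T.$$

Tractability of the bridge thus hinges on whether this ratio of transition densities simplifies.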

Lévy bridges

I am mostly interested in bridges for Lévy processes in particular, rather than arbitrary Markov processes.

It is (more or less) trivial to derive the properties of the bridge of a Wiener process, so that case can be found in every stochastic processes textbook. There is an introduction for Lévy processes in . TODO: check that it permits processes with jumps.
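As a sanity check on the Wiener case, here is a minimal sketch (names are my own) of sampling the bridge at a single intermediate time. Conditioning a Gaussian on a linear functional is itself Gaussian, which gives the familiar linearly interpolated mean and $$\sigma^2 t(T-t)/T$$ variance.

```python
import numpy as np

rng = np.random.default_rng(0)

def brownian_bridge_point(X, Y, t, T, sigma=1.0, rng=rng):
    """Sample x(t) of a Wiener process conditioned on x(0)=X and x(T)=Y.

    The conditional law is Gaussian, with mean interpolating the endpoints
    linearly and variance sigma**2 * t * (T - t) / T, vanishing at both ends.
    """
    mean = X + (t / T) * (Y - X)
    var = sigma**2 * t * (T - t) / T
    return mean + np.sqrt(var) * rng.standard_normal()
```

Applying this recursively (endpoint pair, then midpoint, then quarter-points, …) is the standard dyadic construction of a Brownian bridge path.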

Fitzsimmons, Pitman, and Yor (1993) describe a general assortment of methods for handling the properties of bridges, and Perman, Pitman, and Yor (1992) specialise them to pure-jump processes (Poisson, gamma).

🏗

In fact, “bridge” is a terrible metaphor for these processes, and using this image as an illustration has simplified nothing for me. I am so sorry.

When is a Lévy bridge distribution computationally tractable?

I know the bridge is simple for gamma, Brownian, and Poisson Lévy processes. Are there others? Yor (2007) asserts that, among Lévy processes, only the Brownian and gamma processes have closed-form expressions for the bridge. That is not quite right: a Poisson process also has a simple bridge (binomial subdivision of the increment), and both the gamma and Brownian cases still work with an additional deterministic drift term.

However, I cannot think of other easy examples. Compound Poisson processes, for example, lead to a nasty integral over an unbounded number of dimensions if we condition in the obvious way.

Surely there are proofs about this? I have half a mind to see whether I can find a classification of processes admitting tractable bridges with my rudimentary variational calculus skills.

A classic. TBC.

General Markov processes

Alexandre Hoang Thiery describes Doob’s h-transform method for general Markov processes and notes that while it may be good for deducing certain properties of the paths of such a conditioned process, it is not necessarily a good way of sampling one. Although, if you happen to obtain an analytic form that way, it serves for both.
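The h-transform is at least fully explicit for a finite-state chain in discrete time, which makes a useful toy. A sketch (my own naming, assuming a row-stochastic transition matrix): take $$h_k(x)$$ to be the probability of being at the target state at time $$n$$ starting from $$x$$ at time $$k$$, and reweight the kernel by $$h_{k+1}/h_k$$.

```python
import numpy as np

rng = np.random.default_rng(2)

def bridge_path(P, x0, y, n, rng=rng):
    """Sample a path of length n of a finite Markov chain conditioned to end at y.

    Doob h-transform: h[k, x] = P^(n-k)[x, y] is the probability of hitting
    y at time n from state x at time k; the conditioned chain steps with the
    reweighted kernel P[x, z] * h[k+1, z] / h[k, x], which is again stochastic.
    """
    m = P.shape[0]
    h = np.zeros((n + 1, m))
    h[n] = np.eye(m)[y]            # indicator of the target state
    for k in range(n - 1, -1, -1): # backward recursion: h[k] = P h[k+1]
        h[k] = P @ h[k + 1]
    path = [x0]
    for k in range(n):
        probs = P[path[-1]] * h[k + 1] / h[k][path[-1]]
        probs /= probs.sum()       # guard against floating-point drift
        path.append(int(rng.choice(m, p=probs)))
    return path
```

The backward recursion is the expensive part; for continuous state spaces $$h$$ is rarely available in closed form, which is exactly the sampling difficulty noted above.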

For a fairly old and simple concept, work in this area skews surprisingly recent (i.e. post-1990), given that applications tend to cite the early, gruelling work as the foundation; perhaps for about 30 years no one wanted to learn enough potential theory to make it go. Or perhaps I am missing something because of, e.g., terminology drift.

References

Bertoin, Jean. 1996. Lévy Processes. Cambridge Tracts in Mathematics 121. Cambridge ; New York: Cambridge University Press.
Chung, Kai Lai, and John B. Walsh, eds. 2005. “H-Transforms.” In Markov Processes, Brownian Motion, and Time Symmetry, 320–35. Grundlehren Der Mathematischen Wissenschaften. New York, NY: Springer.
Chung, Kai Lai, and John B. Walsh. 2005. Markov Processes, Brownian Motion, and Time Symmetry. 2nd ed. Grundlehren Der Mathematischen Wissenschaften 249. Berlin; New York: Springer.
O’Connell, Neil. 2003. Journal of Physics A: Mathematical and General 36 (12): 3049–66.
Dembo, Amir. 2013.
Doob, J. L. 1957. Bulletin de la Société Mathématique de France 85: 431–58.
Dufresne, Daniel. 1998. Advances in Applied Mathematics 20 (3): 285–99.
Émery, Michel, and Marc Yor. 2004. Publications of the Research Institute for Mathematical Sciences 40 (3): 669–88.
Fitzsimmons, Pat, Jim Pitman, and Marc Yor. 1993. In Seminar on Stochastic Processes, 1992, edited by E. Çinlar, K. L. Chung, M. J. Sharpe, R. F. Bass, and K. Burdzy, 101–34. Progress in Probability. Boston, MA: Birkhäuser Boston.
Jacod, Jean, and Philip Protter. 1988. The Annals of Probability 16 (2): 620–41.
Janson, Svante. 2011. Eprint arXiv:1112.0220, December.
Levin, David Asher, Y. Peres, and Elizabeth L. Wilmer. 2009. Markov Chains and Mixing Times. Providence, R.I: American Mathematical Society.
O’Connell, Neil. 2003. Transactions of the American Mathematical Society 355 (9): 3669–97.
Perman, Mihael, Jim Pitman, and Marc Yor. 1992. Probability Theory and Related Fields 92 (1): 21–39.
Privault, Nicolas, and Jean-Claude Zambrini. 2004. Annales de l’Institut Henri Poincare (B) Probability and Statistics 40 (5): 599–633.
Steutel, F. W., and K. van Harn. 1979. The Annals of Probability 7 (5): 893–99.
Steutel, Fred W., and Klaas van Harn. 2003. Infinite Divisibility of Probability Distributions on the Real Line. Boca Raton: CRC Press.
Wolpert, Robert L. 2021. arXiv:2106.00087 [Math], May.
Yor, Marc. 2007. In Advances in Mathematical Finance, edited by Michael C. Fu, Robert A. Jarrow, Ju-Yi J. Yen, and Robert J. Elliott, 37–47. Applied and Numerical Harmonic Analysis. Birkhäuser Boston.

1. There is a more general version, given in , where the initial and final states of the process are permitted to have distributions rather than fixed values.↩︎
