# Random change of time

Stochastic processes derived by varying the rate of time’s passage, which turns out to be more convenient than I imagined

August 5, 2015 — February 10, 2020

🏗 Various notes on a.e.-continuous, monotonic random changes of index in order to build new processes.

In warping and registration problems you try to align two or more processes; a random change of time can sometimes be cast as such an alignment problem, but not necessarily.

## 1 To explore

Lamperti representation for continuous state branching processes,

Ogata’s time rescaling: intensity estimation for point processes uses this as a goodness-of-fit test.

Relation to e.g. martingale transforms.

## 2 Subordinator

A subordinator is a non-decreasing Lévy process taking values in the reals. The Gamma process is the classic example, though not the only one: stable subordinators and compound Poisson processes with positive jumps also qualify. Curiously, upon giving that definition, many texts immediately assert that such a process is a model for a random change of time. This is not insane per se, but it doesn’t have much narrative momentum: a Gamma process can model a bunch of things other than time, and it is a weird specialisation in a field that normally tends to excessive generality.

🏗 I should explain why one would bother doing such an arbitrary thing as changing time, though. Basically, it is because a Lévy process time-changed by an independent subordinator is still a Lévy process.
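A minimal simulation sketch of that idea: run a Brownian motion on the random clock given by a Gamma subordinator, which yields a variance-gamma Lévy process. All function names and parameter choices here are mine, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)

def gamma_subordinator(n_steps, dt, shape_rate=1.0, scale=1.0):
    """Simulate a Gamma subordinator T on a grid: each increment
    T(t+dt) - T(t) ~ Gamma(shape_rate * dt, scale), so T is non-decreasing
    with stationary, independent increments."""
    increments = rng.gamma(shape=shape_rate * dt, scale=scale, size=n_steps)
    return np.concatenate([[0.0], np.cumsum(increments)])

def subordinated_brownian(n_steps, dt, sigma=1.0):
    """Run Brownian motion on the random clock: X(t) = W(T(t)).
    Conditional on T, each increment is Gaussian with variance
    sigma^2 * (T increment); marginally X is a variance-gamma process."""
    T = gamma_subordinator(n_steps, dt)
    dT = np.diff(T)
    dX = rng.normal(loc=0.0, scale=sigma * np.sqrt(dT))
    return T, np.concatenate([[0.0], np.cumsum(dX)])

T, X = subordinated_brownian(n_steps=10_000, dt=0.01)
```

The same recipe works with any subordinator in place of the Gamma one; only the distribution of the increments of `T` changes.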

## 3 Point process rate transform

As used in point process residual goodness of fit tests.

A summary in :

Knight showed that for any orthogonal sequence of continuous local martingales, rescaling time for each via its associated predictable process yields a multivariate sequence of independent standard Brownian motions. Meyer then extended Knight’s theorem to the case of point processes, showing that given a simple multivariate point process $$\{N_i ; i = 1, 2, \dots, n\}$$, the multivariate point process obtained by rescaling each $$N_i$$ according to its compensator is a sequence of independent Poisson processes, each having intensity 1. Since then, alternative proofs and variations of this result have been given by . Papangelou gave the following interpretation in the univariate case:

Roughly, moving in $$[0, \infty)$$ so as to meet expected future points at a rate of one per time unit (given at each instant complete knowledge of the past), we meet them at the times of a Poisson process. […]

Generalizations of Meyer’s result to point processes on $$\mathbb{R}^d$$ have been established by . In each case, the method used has been to focus on one dimension of the point process, and rescale each point along that dimension according to the conditional intensity.
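The univariate version of this rescaling is easy to check numerically: simulate an inhomogeneous Poisson process, map each point through its compensator $$\Lambda(t) = \int_0^t \lambda(s)\,ds$$, and the rescaled points should form a unit-rate Poisson process. The intensity below is my own illustrative choice, picked so the compensator has a closed form.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_inhom_poisson(lam, lam_max, t_end):
    """Simulate an inhomogeneous Poisson process on [0, t_end] by thinning
    a homogeneous Poisson process of rate lam_max (Lewis-Shedler)."""
    t, points = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > t_end:
            break
        if rng.random() < lam(t) / lam_max:  # accept with prob lam(t)/lam_max
            points.append(t)
    return np.array(points)

# Intensity lambda(t) = 2 + sin(t), bounded by 3, with closed-form
# compensator Lambda(t) = 2t + 1 - cos(t).
lam = lambda t: 2.0 + np.sin(t)
Lam = lambda t: 2.0 * t + 1.0 - np.cos(t)

points = simulate_inhom_poisson(lam, lam_max=3.0, t_end=500.0)
rescaled = Lam(points)      # time-rescaled points
waits = np.diff(rescaled)   # should be approximately i.i.d. Exp(1)

print(np.mean(waits))  # close to 1 if the rescaling works
```

This is exactly the mechanism behind the point-process residual tests mentioned above: if the fitted compensator is correct, the rescaled inter-arrival times are indistinguishable from i.i.d. unit exponentials.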

## 4 Going Multivariate

As seen in e.g. . How does multivariate time work, then?

## 5 References

Aalen, and Hoem. 1978. Scandinavian Actuarial Journal.
Applebaum. 2009. Lévy Processes and Stochastic Calculus. Cambridge Studies in Advanced Mathematics 116.
Baddeley, Turner, Møller, et al. 2005. Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Barndorff-Nielsen, Ole E., Pedersen, and Sato. 2001. Advances in Applied Probability.
Barndorff-Nielsen, Ole E., and Shiryaev. 2010. Change of Time and Change of Measure. Advanced Series on Statistical Science & Applied Probability.
Brémaud. 1972. “A Martingale Approach to Point Processes.”
Brown, and Nair. 1988. Journal of Applied Probability.
Caballero, and Chaumont. 2006. Journal of Applied Probability.
Chaumont, Pantí, and Rivero. 2013. Bernoulli.
Cheng, and Stephens. 1989. Biometrika.
Çinlar. 1980. Journal of Applied Probability.
Cox. 1955. Journal of the Royal Statistical Society: Series B (Methodological).
Giesecke, Kakavand, and Mousavi. 2008. In Simulation Conference, 2008. WSC 2008. Winter.
Haslinger, Pipa, and Brown. 2010. Neural Computation.
Knight. 1970. The Annals of Mathematical Statistics.
Kurtz. 1980. The Annals of Probability.
Lamperti. 1958. Pacific J. Math.
———. 1967. Bull. Amer. Math. Soc.
———. 1972. Zeitschrift Für Wahrscheinlichkeitstheorie Und Verwandte Gebiete.
Leonenko, Meerschaert, Schilling, et al. 2014. Communications in Applied and Industrial Mathematics.