Generalised Ornstein-Uhlenbeck processes

Markov/AR(1)-like processes

January 10, 2022 — September 21, 2022

dynamical systems
Hilbert space
Lévy processes
probability
regression
signal processing
statistics
stochastic processes
time series

Ornstein-Uhlenbeck-type autoregressive, stationary stochastic processes: e.g. stationary gamma processes, the classic Gaussian-noise Ornstein-Uhlenbeck process… Every Lévy process induces a family of such processes via its bridge.

Figure 1

1 Classic Gaussian

1.1 Discrete time

Given a $K\times K$ real matrix $\Phi$ with all eigenvalues strictly inside the unit disc (spectral radius less than one), and given a sequence $\varepsilon_t$ of multivariate normal variables $\varepsilon_t\sim\mathcal{N}(0,\Sigma)$, with $\Sigma$ a $K\times K$ positive-definite symmetric real matrix, the stationary distribution of the process
$$x_t=\varepsilon_t+\Phi x_{t-1}=\sum_{h=0}^{t}\Phi^h\varepsilon_{t-h}$$
is given by the Lyapunov equation, or just by basic variance identities. It is Gaussian, $x\sim\mathcal{N}(0,\Lambda)$, where $\Lambda$ satisfies the fixed-point equation $\Lambda=\Phi\Lambda\Phi^\top+\Sigma$. The solution is also, apparently, the limit of the summation $\Lambda=\sum_{k=0}^{\infty}\Phi^k\Sigma(\Phi^\top)^k$.
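The fixed point can be computed with a standard discrete Lyapunov solver and cross-checked against the truncated series; a minimal sketch, where $\Phi$ and $\Sigma$ are arbitrary illustrative values rather than anything from the text:

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Illustrative stable transition matrix (spectral radius < 1) and noise covariance.
Phi = np.array([[0.6, 0.2],
                [0.1, 0.5]])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])

# Stationary covariance from the discrete Lyapunov equation
# Lambda = Phi @ Lambda @ Phi.T + Sigma.
Lam = solve_discrete_lyapunov(Phi, Sigma)

# Cross-check against the truncated series sum_k Phi^k Sigma (Phi^T)^k.
Lam_series = sum(
    np.linalg.matrix_power(Phi, k) @ Sigma @ np.linalg.matrix_power(Phi.T, k)
    for k in range(200)
)
print(np.allclose(Lam, Lam_series))  # True
```

Simoncini (2016) surveys numerical methods for matrix equations of this kind at scale.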

1.2 Continuous time

Suppose we use a Wiener process $W$ as the driving noise in continuous time, with some small increment $\epsilon$:
$$\mathrm{d}x(t)=-\epsilon A\,x(t)\,\mathrm{d}t+\epsilon B\,\mathrm{d}W(t).$$
This is the Ornstein-Uhlenbeck process. If stable, at stationarity it has an analytic stationary density $x\sim\mathcal{N}(0,\Lambda)$ where $\Lambda A^\top+A\Lambda=\epsilon BB^\top$.
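The continuous-time stationary covariance likewise comes from a Lyapunov solver; as a sanity check, an Euler step of size $h$ turns the SDE into an AR(1) recursion whose stationary covariance should approach the continuous one as $h\to 0$. A sketch with illustrative $A$ and $B$ (note $A$ needs eigenvalues with positive real part so that the drift $-\epsilon A$ is stable):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, solve_discrete_lyapunov

eps = 0.1
# Illustrative matrices; A's eigenvalues have positive real part.
A = np.array([[1.0, 0.4],
              [0.0, 1.5]])
B = np.array([[1.0, 0.0],
              [0.2, 0.8]])

# Solve A @ Lam + Lam @ A.T = eps * B @ B.T for the stationary covariance.
Lam = solve_continuous_lyapunov(A, eps * B @ B.T)

# Sanity check: an Euler step of size h gives the AR(1) recursion
# x_{t+h} = (I - h*eps*A) x_t + noise, with noise covariance h * eps**2 * B @ B.T,
# whose stationary covariance approaches Lam as h -> 0.
h = 1e-3
Phi = np.eye(2) - h * eps * A
Lam_disc = solve_discrete_lyapunov(Phi, h * eps**2 * B @ B.T)
print(np.max(np.abs(Lam - Lam_disc)))  # small discretisation error
```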

2 Gamma

Over at Gamma processes, Wolpert (2021) notes several example constructions which “look like” Ornstein-Uhlenbeck processes, in that they are stationary and autoregressive, but are constructed by different means. Should we look at processes like those here?

For fixed $\alpha,\beta>0$ these notes present six different stationary time series, each with gamma $X_t\sim\mathrm{Ga}(\alpha,\beta)$ univariate marginal distributions and autocorrelation function $\rho^{|s-t|}$ for $X_s,X_t$. Each will be defined on some time index set $\mathcal{T}$, either $\mathcal{T}=\mathbb{Z}$ or $\mathcal{T}=\mathbb{R}$.

Five of the six constructions can be applied to other infinitely divisible (ID) distributions as well, both continuous ones (normal, $\alpha$-stable, etc.) and discrete (Poisson, negative binomial, etc.). For the Poisson and Gaussian distributions specifically, all but one of them (the Markov change-point construction) coincide: essentially, there is just one “AR(1)-like” Gaussian process (namely, the AR(1) process in discrete time, or the Ornstein-Uhlenbeck process in continuous time), and there is just one AR(1)-like Poisson process. For other ID distributions, however, and in particular for the gamma, each of these constructions yields a process with the same univariate marginal distributions and the same autocorrelation, but with different joint distributions at three or more times.
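For concreteness, here is a sketch of one such gamma AR(1)-type construction, based on beta thinning; I believe a construction of this kind is among those Wolpert surveys, but the parameterisation below is my own reconstruction. It uses the beta-gamma identity: if $X\sim\mathrm{Ga}(\alpha,\beta)$ and independently $B\sim\mathrm{Beta}(\alpha\rho,\alpha(1-\rho))$, then $BX\sim\mathrm{Ga}(\alpha\rho,\beta)$, so adding an independent $\mathrm{Ga}(\alpha(1-\rho),\beta)$ innovation restores the $\mathrm{Ga}(\alpha,\beta)$ marginal and gives autocorrelation $\rho^{|s-t|}$:

```python
import numpy as np

rng = np.random.default_rng(42)

def gamma_ar1(alpha, beta, rho, n):
    """Beta-thinned gamma AR(1): X_t = B_t * X_{t-1} + eps_t."""
    # Thinning variables B_t shrink the marginal to Ga(alpha*rho, beta);
    # innovations eps_t ~ Ga(alpha*(1-rho), beta) top it back up.
    b = rng.beta(alpha * rho, alpha * (1.0 - rho), size=n)
    eps = rng.gamma(alpha * (1.0 - rho), 1.0 / beta, size=n)  # shape, scale
    x = np.empty(n)
    x[0] = rng.gamma(alpha, 1.0 / beta)  # start at the stationary marginal
    for t in range(1, n):
        x[t] = b[t] * x[t - 1] + eps[t]
    return x

alpha, beta, rho = 3.0, 2.0, 0.7
x = gamma_ar1(alpha, beta, rho, 200_000)
print(x.mean())  # ~ alpha / beta = 1.5
print(x.var())   # ~ alpha / beta**2 = 0.75
print(np.corrcoef(x[:-1], x[1:])[0, 1])  # ~ rho = 0.7
```

Since $B_t$ is independent of $X_{t-1}$ and $\mathbb{E}[B_t]=\rho$, the lag-1 covariance is $\rho\,\mathrm{Var}(X_{t-1})$, and the Markov structure gives $\rho^k$ at lag $k$.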

3 References

Ahn, Korattikara, and Welling. 2012. “Bayesian Posterior Sampling via Stochastic Gradient Fisher Scoring.” In Proceedings of the 29th International Conference on Machine Learning. ICML’12.
Alexos, Boyd, and Mandt. 2022. “Structured Stochastic Gradient MCMC.” In Proceedings of the 39th International Conference on Machine Learning.
Chen, Tianqi, Fox, and Guestrin. 2014. “Stochastic Gradient Hamiltonian Monte Carlo.” In Proceedings of the 31st International Conference on Machine Learning.
Chen, Zaiwei, Mou, and Maguluri. 2021. “Stationary Behavior of Constant Stepsize SGD Type Algorithms: An Asymptotic Characterization.”
Mandt, Hoffman, and Blei. 2017. “Stochastic Gradient Descent as Approximate Bayesian Inference.” JMLR.
Simoncini. 2016. “Computational Methods for Linear Matrix Equations.” SIAM Review.
Wolpert. 2021. “Lecture Notes on Stationary Gamma Processes.” arXiv:2106.00087 [Math].