# Poisson point processes

October 14, 2019 — January 29, 2020

Lévy processes
point processes
probability
spatial
stochastic processes

$\renewcommand{\var}{\operatorname{Var}} \renewcommand{\dd}{\mathrm{d}} \renewcommand{\pd}{\partial} \renewcommand{\bb}[1]{\mathbb{#1}} \renewcommand{\bf}[1]{\mathbf{#1}} \renewcommand{\vv}[1]{\boldsymbol{#1}} \renewcommand{\mm}[1]{\mathrm{#1}} \renewcommand{\cc}[1]{\mathcal{#1}} \renewcommand{\oo}[1]{\operatorname{#1}} \renewcommand{\gvn}{\mid} \renewcommand{\II}{\mathbb{I}} \renewcommand{\Ex}{\mathbb{E}} \renewcommand{\Pr}{\mathbb{P}}$

Poisson processes are possibly the simplest subordinators, i.e. non-decreasing Lévy processes. They pop up everywhere, especially as a representation of point processes, and they are typically the second continuous-time stochastic process anyone learns, after Brownian motion.

## 1 Basics

A Poisson process $$\{\mathsf{n}\}\sim \operatorname{PoisP}(\lambda)$$ is a stochastic process whose inter-occurrence times are independently and identically distributed, with $$\mathsf{t}_i-\mathsf{t}_{i-1}\sim\operatorname{Exp}(\lambda)$$ (rate parameterization). By convention, we set all $$\mathsf{t}_i\geq 0$$ and $$\mathsf{n}(0)=0$$ a.s. The counting process that increases by one at each event is one convenient representation for such a process. In that case we think of the paths of the process as $$\mathsf{n}: \mathbb{R}^+\to\mathbb{Z}^+$$ such that $$\mathsf{n}(t)\equiv \sum_{i=1}^\infty\mathbb{I}_{\{\mathsf{t}_i\leq t\}}$$.

The following is a simulation of such a creature, based on Matt Motoki’s answer on Stack Overflow.

```r
set.seed(1234)
lambda <- 0.3
n <- 10

# arrival times are cumulative sums of i.i.d. Exp(lambda) interarrivals
last_arrival <- 0
arrival_times <- c()
counts <- 1:n
for (i in counts) {
  last_arrival <- rexp(1, rate = lambda) + last_arrival
  arrival_times <- c(arrival_times, last_arrival)
}

# plot the counting process as a right-continuous step function
plot(arrival_times, counts, pch = 16, ylim = c(0, n))
points(arrival_times, c(0, counts[-n]))
segments(
  x0 = c(0, arrival_times[-n]),
  y0 = c(0, counts[-n]),
  x1 = arrival_times,
  y1 = c(0, counts[-n])
)
```

It is easy to show that $$\mathsf{n}(t)\sim\textrm{Pois}(\lambda t)$$, where the pmf of a $$\textrm{Pois}(\lambda)$$ RV is

$f(k;\lambda)={\frac {\lambda^{k}}{k!}}e^{-\lambda}.$

It is a standard result that the increments of such a process over disjoint intervals are independent and also Poisson distributed. For $$t \geq s$$:

$\mathsf{n}(t)-\mathsf{n}(s) \sim \textrm{Pois}\left((t-s)\lambda\right).$
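We can sanity-check the marginal distribution $$\mathsf{n}(t)\sim\textrm{Pois}(\lambda t)$$ numerically by simulating from the $$\operatorname{Exp}(\lambda)$$ interarrivals directly; the helper name and parameter values below are arbitrary choices of mine.

```r
set.seed(42)
lambda <- 0.3
t <- 10
n_reps <- 1e5

# count the number of arrivals in [0, t] by accumulating
# Exp(lambda) interarrival times until they exceed t
count_at <- function(t, lambda) {
  total <- 0
  k <- 0
  repeat {
    total <- total + rexp(1, rate = lambda)
    if (total > t) return(k)
    k <- k + 1
  }
}
counts <- replicate(n_reps, count_at(t, lambda))

# a Pois(lambda * t) variate has mean and variance both lambda * t
mean(counts)  # close to lambda * t = 3
var(counts)   # close to lambda * t = 3
```

Matching mean and variance is of course only a heuristic check, but it is a cheap one: for most misspecifications of the interarrival law the two diverge quickly.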

Note also the standard result that

$\lambda=\lim_{h\to 0} \frac{\Ex\left[\mathsf{n}(t+h)-\mathsf{n}(t)\right]}{h}.$

We call $$\lambda$$ the rate.

## 2 Poisson distribution

### 2.1 Moments

Using the moment generating function

$\Ex[\exp(s\mathsf{n})]= \exp[\lambda (e^{s}-1)]$

we can find the raw moments by differentiating and evaluating at $$s=0$$,

\begin{aligned} \frac{\dd}{\dd s}\Ex[\exp(s\mathsf{n})] &=\frac{\dd}{\dd s} \exp[\lambda (e^{s}-1)]\\ &=\lambda e^{s} \exp[\lambda (e^{s}-1)]\\ \end{aligned}

and thus

\begin{aligned} \Ex[\mathsf{n}] &= \lambda\\ \Ex[\mathsf{n}^2] &= \lambda+\lambda^2\\ \Ex[\mathsf{n}^3] &= \lambda+3\lambda^2 + \lambda^3\\ \Ex[\mathsf{n}^4] &= \lambda+7\lambda^2+6\lambda^3+\lambda^4\\ \dots\\ \Ex[\mathsf{n}^k] &=\sum _{i=0}^{k}\lambda ^{i}\left\{{\begin{matrix}k\\i\end{matrix}}\right\}, \end{aligned}

where $$\left\{{\begin{matrix}k\\i\end{matrix}}\right\}$$ denotes a Stirling number of the second kind, so that $$\Ex[\mathsf{n}^k]$$ is the $$k$$-th Touchard (Bell) polynomial evaluated at $$\lambda$$.
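The moment formula is easy to check numerically. The sketch below (function names are mine) computes Stirling numbers of the second kind via the standard recurrence $$S(k,i)=i\,S(k-1,i)+S(k-1,i-1)$$ and compares the resulting moment against a Monte Carlo estimate.

```r
set.seed(1)
lambda <- 2

# Stirling number of the second kind, via the recurrence
# S(k, i) = i * S(k-1, i) + S(k-1, i-1)
stirling2 <- function(k, i) {
  if (k == 0 && i == 0) return(1)
  if (k == 0 || i == 0) return(0)
  i * stirling2(k - 1, i) + stirling2(k - 1, i - 1)
}

# k-th raw moment of Pois(lambda): sum_i lambda^i * S(k, i)
pois_moment <- function(k, lambda) {
  sum(sapply(0:k, function(i) lambda^i * stirling2(k, i)))
}

pois_moment(3, lambda)      # lambda + 3*lambda^2 + lambda^3 = 2 + 12 + 8 = 22
mean(rpois(1e6, lambda)^3)  # Monte Carlo estimate, close to 22
```

The naive recursion is exponential in $$k$$, which is fine for small moments; memoize it if you need large $$k$$.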

## 3 Poisson bridge

Suppose we are given the value of a Poisson process at time $$0$$ and time $$1$$ and are concerned with some $$t\in(0,1).$$ We wish to know the conditional distribution $$P(\mathsf{n}(t)|\mathsf{n}(1)=S)$$, i.e. the Poisson bridge. Using the distributions of the increments and the independence property,

\begin{aligned} P[\mathsf{n}(t)=n|\mathsf{n}(1)=S] &=\frac{P[\mathsf{n}(t)=n\cap \mathsf{n}(1)=S]}{P[\mathsf{n}(1)=S]}\\ &=\frac{P[\mathsf{n}(t)=n\cap \tilde{\mathsf{n}}(1-t)=S-n]}{P[\mathsf{n}(1)=S]}\\ &=\frac{ \frac{(\lambda t)^{n}}{n!}e^{-\lambda t} \frac{(\lambda (1-t))^{S-n}}{(S-n)!}e^{-\lambda (1-t)} }{ \frac{\lambda^S }{S!}e^{-\lambda } }\\ &=\frac{(\lambda t)^{n}(\lambda (1-t))^{S-n}}{\lambda^{S}} \frac{e^{-\lambda t}e^{-\lambda (1-t)}}{e^{-\lambda}} \frac{S!}{(S-n)!n!}\\ &=t^{n} (1-t)^{S-n} \binom{S}{n}\\ &=\operatorname{Binom}(n;S, t), \end{aligned}

where $$\tilde{\mathsf{n}}$$ is an independent copy of the process accounting for the increment over $$(t,1]$$. So we can simulate a Poisson bridge at a single time $$t<1$$ by sampling a Binomial random variable.
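Here is a minimal check of that Binomial identity (names and parameter values are mine): simulate pairs $$(\mathsf{n}(t), \mathsf{n}(1))$$ using the independent-increments property, condition on $$\mathsf{n}(1)=S$$ by brute force, and compare with $$\operatorname{Binom}(S, t)$$.

```r
set.seed(7)
lambda <- 5
t <- 0.3
S <- 6
n_reps <- 2e5

# sample n(t) given n(1) = S directly, per the Binomial identity
bridge_sample <- function(S, t) rbinom(1, size = S, prob = t)

# brute-force check: build n(1) from two independent Poisson increments,
# then keep only the paths with n(1) = S
nt <- rpois(n_reps, lambda * t)
n1 <- nt + rpois(n_reps, lambda * (1 - t))
cond <- nt[n1 == S]

mean(cond)  # close to the Binomial mean S * t = 1.8
var(cond)   # close to the Binomial variance S * t * (1 - t) = 1.26
```

Note that the bridge law does not depend on $$\lambda$$ at all; the rate only affects how likely we are to see $$\mathsf{n}(1)=S$$ in the first place.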

## 4 Hitting time

Consider the hitting time of a level $$\ell\in \bb{N}$$ for a Poisson process $$\mathsf{n}$$.

\begin{aligned} \bb{P}\left[\mathsf{n}(t)<\ell\right] &=\bb{P}\left[\inf\{s: \mathsf{n}(s) \geq \ell\} >t\right]\\ &=\bb{P}\left[ \mathsf{t}_{\ell} >t\right]\\ &=\bb{P}\left[ \left(\sum_{k=1}^{\ell}(\mathsf{t}_{k}-\mathsf{t}_{k-1}) \right)>t\right], \end{aligned}

where we set $$\mathsf{t}_0:=0$$.

But since the $$\mathsf{t}_{k}-\mathsf{t}_{k-1} \sim \operatorname{Exp}(\lambda)$$ are i.i.d., and the sum of i.i.d. Exponential variates is a Gamma variate, we have that

$\mathsf{t}_\ell \sim \operatorname{Gamma}(\ell, \lambda)$

and hence the density of this random variable is the Gamma distribution density,

$f(t)={\frac{\lambda^{\ell}}{\Gamma(\ell)}}t^{\ell -1}e^{-\lambda t}.$

The Gamma process is a kind of dual to the Poisson; we can think of it as a distribution over specific hitting times of an imaginary latent Poisson process.
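That correspondence is also easy to verify by simulation; the parameter values below are arbitrary choices of mine.

```r
set.seed(11)
lambda <- 0.5
ell <- 4
n_reps <- 1e5

# hitting time of level ell: sum of ell i.i.d. Exp(lambda) interarrivals
hits <- colSums(matrix(rexp(ell * n_reps, rate = lambda), nrow = ell))

mean(hits)  # close to the Gamma(ell, lambda) mean: ell / lambda = 8

# compare a few empirical quantiles with the Gamma quantile function
quantile(hits, c(0.25, 0.5, 0.75))
qgamma(c(0.25, 0.5, 0.75), shape = ell, rate = lambda)
```

For non-integer shapes there is no such interarrival construction, which is one way to see that the Gamma family strictly extends these Poisson hitting times.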

## 5 Taylor expansion

See Schoutens, Leuven, and Studer (2001).

## 6 References

Giles. 2016. ACM Trans. Math. Softw.
Kingman. 1992. Poisson Processes.
Schoutens, Leuven, and Studer. 2001.