Calculus that works, in a certain sense, for random objects, of certain types. Instrumental in stochastic differential equations. This is a popular and well-explored tool and my notes are not supposed to be tutorials, although they might resemble that sometimes. I simply want to maintain a list of useful definitions because the literature gets messy and ambiguous sometimes.

## Simple integrals of noise

Some of the complexity of stochastic integrals derives from needing to preserve causality for processes happening in time, in terms of various flavours of filtration etc. For simple stochastic integrals of deterministic functions we can avoid all that. Robert J. Adler, Taylor, and Worsley (2016), for example, construct a standard one:

We define a Gaussian noise \(W\) based on spectral density \(\nu\) as a random process defined on the Borel subsets of \(\mathbb{R}^{N}\) such that, for all \(A, B \in \mathcal{B}^{N}\) with \(\nu(A)\) and \(\nu(B)\) finite, \[ \begin{aligned} &W(A) \sim \mathcal{N}(0, \nu(A)) \\ &A \cap B=\emptyset \Rightarrow W(A \cup B)=W(A)+W(B) \text { a.s. }\\ &A \cap B=\emptyset \Rightarrow W(A) \perp W(B). \end{aligned} \] We can think of \(\nu\) as the measure which allocates signal power to spectrum.

Having defined Gaussian noise, we then define the integral
\[
\int_{\mathbb{R}^{N}} \varphi(t) W(d t)
\]
for deterministic \(\varphi\) with \(\int \varphi^{2}(x) \nu(d x)<\infty\).
We do this with the “standard machinery”, i.e. we start with *simple functions*
\[
\varphi(t)=\sum_{1}^{n} a_{i} \mathbb{1}_{A_{i}}(t)
\]
where \(A_{1}, \ldots, A_{n} \subset \mathbb{R}^{N}\) are disjoint, and the \(a_{i}\) are real, and define
\[
W(\varphi) \equiv \int_{\mathbb{R}^{N}} \varphi(t) W(d t)=\sum_{1}^{n} a_{i} W\left(A_{i}\right)
\]
Taking sums of Gaussian RVs leaves us with a Gaussian RV, so \(W(\varphi)\) has zero mean and variance given by \(\sum a_{i}^{2} \nu\left(A_{i}\right)\).
Now think of \(W\) as a mapping from simple functions to random variables.
We extend it to all functions square integrable with respect to \(\nu\) by taking limits of approximating simple functions.
This works because \(\varphi \mapsto W(\varphi)\) is an isometry from the simple functions, under the \(L^{2}(\nu)\) norm, into \(L^{2}(\mathbb{P})\) (this is the covariance identity computed below), and the simple functions are dense in \(L^{2}(\nu)\); so the limit exists and does not depend on the choice of approximating sequence.
Further extension to general Lévy noise or processes with non-zero mean is not too complicated.

Defining two simple functions on the same sets, \[ \varphi(t)=\sum_{1}^{n} a_{i} \mathbb{1}_{A_{i}}(t), \quad \psi(t)=\sum_{1}^{n} b_{i} \mathbb{1}_{A_{i}}(t) \] we see that \[ \begin{aligned} \mathbb{E}\{W(\varphi) W(\psi)\} &=\mathbb{E}\left\{\sum_{1}^{n} a_{i} W\left(A_{i}\right) \cdot \sum_{1}^{n} b_{i} W\left(A_{i}\right)\right\} \\ &=\sum_{1}^{n} a_{i} b_{i} \mathbb{E}\left\{\left[W\left(A_{i}\right)\right]^{2}\right\} \\ &=\sum_{1}^{n} a_{i} b_{i} \nu\left(A_{i}\right) \\ &=\int_{\mathbb{R}^{N}} \varphi(t) \psi(t) \nu(d t) \end{aligned} \] Taking the limit, we see that also \[ \mathbb{E}\left\{\int_{\mathbb{R}^{N}} \varphi(t) W(d t) \int_{\mathbb{R}^{N}} \psi(t) W(d t)\right\}=\int_{\mathbb{R}^{N}} \varphi(t) \psi(t) \nu(d t).\]
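As a sanity check, here is a minimal numerical sketch of that covariance identity for simple functions. All the concrete choices (unit-interval sets \(A_i\), \(\nu\) Lebesgue so \(\nu(A_i)=1\), and the coefficients) are illustrative, not from the construction above:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative setup: disjoint sets A_i = [i, i+1) with nu = Lebesgue measure,
# so nu(A_i) = 1 and W(A_i) ~ N(0, 1), independent across disjoint sets.
n = 4
a = np.array([1.0, -2.0, 0.5, 3.0])  # coefficients of phi
b = np.array([0.5, 1.0, -1.0, 2.0])  # coefficients of psi
nu = np.ones(n)                      # nu(A_i)

n_samples = 200_000
W_A = rng.normal(0.0, np.sqrt(nu), size=(n_samples, n))

# W(phi) = sum_i a_i W(A_i), and likewise for psi
W_phi = W_A @ a
W_psi = W_A @ b

# E[W(phi) W(psi)] should match sum_i a_i b_i nu(A_i) = int phi psi dnu
empirical = float(np.mean(W_phi * W_psi))
exact = float(np.sum(a * b * nu))
```

The Monte Carlo estimate `empirical` agrees with `exact` up to sampling error.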

Grand! Except we want to extend this construction to complex random functions also, in which case the following is natural for a centred one.

\[ \begin{aligned} &\mathbb{E}\{W(A)\}=0\\ &\mathbb{E}\{W(A) \overline{W(A)}\}=\nu(A) \\ &A \cap B=\emptyset \Rightarrow W(A \cup B)=W(A)+W(B) \text { a.s. } \\ &A \cap B=\emptyset \Rightarrow \mathbb{E}\{W(A) \overline{W(B)}\}=0. \end{aligned} \]

If we want to make it Gaussian in particular, say, we need to specify the distributions precisely. We think of a complex Gaussian RV as simply a 2-dimensional Gaussian and all the usual rules apply. Note that because these are complex numbers, some of the rules are subtly different: to fully specify the process we need to decompose that covariance into the joint distribution of real and imaginary parts,

\[ \begin{aligned} &\begin{bmatrix} \mathcal{R}(W(A))\\ \mathcal{I}(W(A))\\ \end{bmatrix} \sim \mathcal{N}\left(0, \begin{bmatrix} \nu_{\mathcal{R}^2}(A) & \nu_{\mathcal{IR}}(A)\\ \nu_{\mathcal{IR}}(A) & \nu_{\mathcal{I}^2}(A) \end{bmatrix}\right)\\ &\text{ where }\nu_{\mathcal{R}^2}(A) + \nu_{\mathcal{I}^2}(A) = \nu(A)\\ &\text{ and }\nu_{\mathcal{R}^2}(A)\nu_{\mathcal{I}^2}(A) \geq \nu_{\mathcal{IR}}^2(A) \text{ (positive semi-definiteness of the covariance).} \end{aligned} \]

Once again from Robert J. Adler, Taylor, and Worsley (2016), we get the suggestion that we could think of this stochastic integral as an infinitesimal version of the Karhunen-Loève expansion. Suppose the (eigenvalue, eigenfunction) pairs are \(\{(\lambda_{\omega}, e^{i\langle t, \omega\rangle}) ; \omega \in \mathbb{R}^{N}\}\) and that \(\lambda_{\omega} \neq 0\) for only a countable number of \(\omega \in \mathbb{R}^{N}\). Then the stationary, complex version of the Mercer expansion tells us that \[ K(t)=\sum_{\omega} \lambda_{\omega} e^{i\langle t, \omega\rangle} \] while the Karhunen-Loève expansion becomes \[ f(t)=\sum_{\omega} \lambda_{\omega}^{1 / 2} \xi_{\omega} e^{i\langle t, \omega\rangle}. \] Fine so far. But if the basis is not countable it gets weird. \[ K(t)=\int_{\mathbb{R}^{N}} \lambda_{\omega} e^{i\langle t, \omega\rangle} d \omega \] and \[ f(t)=\int_{\mathbb{R}^{N}} \lambda_{\omega}^{1 / 2} \xi_{\omega} e^{i\langle t, \omega\rangle} d \omega. \] Everything is well defined in the first of these integrals, but the second is ill-defined because \(\omega\) now parameterises an uncountable basis; how then can the \(\xi_{\omega}\) be independent for each \(\omega\)? We cannot have that as such, but we can get an analogue of it by interpreting the spectral representation as a stochastic integral, writing \[ f(t)=\int_{\mathbb{R}^{N}} e^{i\langle t, \omega\rangle} W(d \omega) \] where \(W\) is Gaussian \(\nu\)-noise with spectral measure defined by \(\nu(d \omega)=\lambda_{\omega} d \omega .\) This now evokes classic signal processing textbook exercises, at least to me.
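To make that concrete, here is a sketch that discretises the spectral measure into frequency bins and synthesises the process as a random Fourier sum, using the real cosine/sine form of the representation (for a real-valued process in one dimension), then checks the empirical covariance against \(K(\tau)=\int \lambda_{\omega} \cos(\omega \tau)\, d\omega\). The spectral density \(\lambda_{\omega}=e^{-\omega^{2}/2}\) is an arbitrary illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretise the spectral measure nu(domega) = lambda(omega) domega on [0, 6];
# lambda(omega) = exp(-omega^2 / 2) is an assumed, illustrative spectral density.
omega = np.linspace(0.0, 6.0, 200)
d_omega = omega[1] - omega[0]
lam = np.exp(-omega**2 / 2)

t = np.linspace(0.0, 3.0, 50)

# Real form of the spectral representation: independent cos/sin weights per bin,
# each scaled by sqrt(lambda(omega) * domega), i.e. by the root of nu on the bin.
n_paths = 20_000
xi = rng.standard_normal((n_paths, omega.size))
eta = rng.standard_normal((n_paths, omega.size))
amp = np.sqrt(lam * d_omega)
f = (xi * amp) @ np.cos(np.outer(omega, t)) + (eta * amp) @ np.sin(np.outer(omega, t))

# Empirical covariance E[f(t) f(0)] versus K(t) = int lambda cos(omega t) domega
emp_cov = np.mean(f * f[:, [0]], axis=0)
K = (lam * d_omega) @ np.cos(np.outer(omega, t))
max_dev = float(np.max(np.abs(emp_cov - K)))
```

The sample paths are stationary Gaussian with the prescribed covariance, up to discretisation of the spectrum and Monte Carlo error.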

## Itô Integral

TBD.

## Itô’s lemma

Let \(X=\left(X^{1}, \ldots, X^{n}\right)\) be a tuple of semimartingales and let \(f: \mathbb{R}^{n} \rightarrow\mathbb{R}\) have continuous second-order partial derivatives. Then \(f(X)\) is also a semimartingale and the following formula holds:

\[\begin{aligned} f\left(X_{t}\right) - f\left(X_{0}\right) &= \sum_{i=1}^{n} \int_{0+}^{t} \frac{\partial f}{\partial x_{i}}\left(X_{s-}\right) \mathrm{d} X_{s}^{i} \\ &\quad +\frac{1}{2} \sum_{1 \leq i, j \leq n} \int_{0+}^{t} \frac{\partial^{2} f}{\partial x_{i} \partial x_{j}}\left(X_{s-}\right) \mathrm{d}\left[X^{i}, X^{j}\right]_{s}^{c} \\ &\quad +\sum_{0<s \leq t}\left(f\left(X_{s}\right)-f\left(X_{s-}\right)-\sum_{i=1}^{n} \frac{\partial f}{\partial x_{i}}\left(X_{s-}\right) \Delta X_{s}^{i}\right) \end{aligned}\]

Here the bracket term is the *quadratic covariation*,
\[
[X,Y]_t := X_t Y_t - \int_0^t X_{s-} \,\mathrm{d}Y_s - \int_0^t Y_{s-} \,\mathrm{d}X_s,
\]
and \(\left[X^{i}, X^{j}\right]^{c}\) denotes its continuous part.

For a continuous semimartingale the jump terms vanish and the left limits \(X_{s-}\) coincide with \(X_{s}\), so the formula reduces to \[\begin{aligned} f\left(X_{t}\right) - f\left(X_{0}\right) &= \sum_{i=1}^{n} \int_{0+}^{t} \frac{\partial f}{\partial x_{i}}\left(X_{s}\right) \mathrm{d} X_{s}^{i} \\ &+\frac{1}{2} \sum_{1 \leq i, j \leq n} \int_{0+}^{t} \frac{\partial^{2} f}{\partial x_{i} \partial x_{j}}\left(X_{s}\right) \mathrm{d}\left[X^{i}, X^{j}\right]_{s}^{c} \end{aligned}\]
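We can check the continuous-case formula numerically in the simplest nontrivial example, \(f(x)=x^{2}\) with \(X=W\) a standard Brownian motion, where it reads \(W_{t}^{2} = 2\int_{0}^{t} W_{s}\,\mathrm{d}W_{s} + t\). This is an illustrative sketch, discretising the Itô integral by left-endpoint sums:

```python
import numpy as np

rng = np.random.default_rng(1)

# Ito's formula for f(x) = x^2 with X = W a standard Brownian motion:
# W_t^2 - W_0^2 = 2 int_0^t W_s dW_s + t,  since f'' = 2 and d[W, W]_s = ds.
T, n_steps, n_paths = 1.0, 2_000, 5_000
dt = T / n_steps
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.cumsum(dW, axis=1)
W_left = np.hstack([np.zeros((n_paths, 1)), W[:, :-1]])  # left endpoints W_{s-}

ito_integral = np.sum(W_left * dW, axis=1)  # int_0^T W dW by left-point sums
lhs = W[:, -1] ** 2          # f(W_T) - f(W_0)
rhs = 2 * ito_integral + T   # right-hand side of Ito's formula
```

The pathwise discrepancy between `lhs` and `rhs` is exactly the discretisation error of the quadratic variation, and shrinks as the step size does.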

Some authors *assume* that in Itô calculus the driving noise is not a general semimartingale but a Brownian motion, which is a continuous driving noise.
Others use the term *Itô calculus* to describe a more general version.
Sometimes the distinction is made between an *Itô diffusion*, with a Brownian driving term, and a *Lévy SDE*, which carries no implication that the driving term is Brownian.
However, this is generally messy and particularly in tutorials by the applied finance people it can be hard to work out which set of definitions they are using.

## Itô isometry

TBD. For now, see Quadratic Variations and the Ito Isometry – Almost Sure.
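Pending proper notes, here is a Monte Carlo sketch of the statement \(\mathbb{E}\bigl[(\int_{0}^{T} H_{s}\,\mathrm{d}W_{s})^{2}\bigr]=\mathbb{E}\bigl[\int_{0}^{T} H_{s}^{2}\,\mathrm{d}s\bigr]\) for adapted \(H\); the integrand \(H_{s}=W_{s}\) is an arbitrary illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(2)

# Ito isometry: E[(int_0^T H dW)^2] = E[int_0^T H_s^2 ds] for adapted H.
# Illustrative integrand H_s = W_s, so both sides equal T^2 / 2 = 1/2 here.
T, n_steps, n_paths = 1.0, 1_000, 10_000
dt = T / n_steps
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W_left = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dW[:, :-1], axis=1)])

integral = np.sum(W_left * dW, axis=1)  # int_0^T H dW, left-point sums
lhs = float(np.mean(integral ** 2))                     # E[(int W dW)^2]
rhs = float(np.mean(np.sum(W_left ** 2, axis=1) * dt))  # E[int W^2 ds]
```

Both Monte Carlo estimates land near \(1/2\), matching the isometry up to sampling and discretisation error.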

## Stratonovich Integral

A modification of the Itô integral which can “look forward” infinitesimally in time, evaluating the integrand at the midpoint of each interval rather than the left endpoint; this buys us the classical chain rule at the cost of the martingale property. It has an alternative justification in terms of rough paths.
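The difference is visible already in \(\int_{0}^{1} W\,\mathrm{d}W\): the left-endpoint (Itô) and midpoint (Stratonovich) Riemann sums converge to limits that differ by half the quadratic variation, \(T/2\). A sketch:

```python
import numpy as np

rng = np.random.default_rng(3)

# Left-point (Ito) vs midpoint (Stratonovich) Riemann sums for int_0^1 W dW.
# Limits:  Ito = (W_1^2 - 1) / 2,   Stratonovich = W_1^2 / 2.
n_steps = 100_000
dt = 1.0 / n_steps
dW = rng.normal(0.0, np.sqrt(dt), size=n_steps)
W = np.concatenate([[0.0], np.cumsum(dW)])

ito = float(np.sum(W[:-1] * dW))                    # evaluate at left endpoint
strat = float(np.sum(0.5 * (W[:-1] + W[1:]) * dW))  # evaluate at midpoint value

gap = strat - ito  # = half the summed squared increments, tending to T/2
```

The midpoint sum telescopes to \(W_{1}^{2}/2\) exactly, while the gap concentrates on \(1/2\) as the mesh shrinks.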

## Doss-Sussman transform

Reducing a stochastic integral to a deterministic one, i.e. replacing the Itô integral with a Lebesgue one. (Sussmann 1978; Karatzas and Ruf 2016).

Assume \(\sigma \in C^{1,1}(\mathbb{R})\), \(\sigma, \sigma^{\prime} \in L^{\infty}\), and \(b \in C^{0,1}\). Then \[ \mathrm{d} X(t)=b(X(t)) \mathrm{d} t+\frac{1}{2} \sigma(X(t)) \sigma^{\prime}(X(t)) \mathrm{d} t+\sigma(X(t)) \mathrm{d} W_{t} \] has a unique (strong) solution \(X=u(W, Y)\) for some \(u \in C^{2}(\mathbb{R})\) and \[ \mathrm{d} Y(t)=f(W(t), Y(t)) \mathrm{d} t \] for some \(f \in C^{0,1}\).
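In the illustrative linear case \(\sigma(x)=\sigma x\), \(b(x)=bx\) (my choice, for concreteness), the recipe gives \(u(w,y)=y e^{\sigma w}\) and \(\mathrm{d}Y = bY\,\mathrm{d}t\), hence the pathwise solution \(X_t = X_0 e^{bt+\sigma W_t}\). We can check it against Euler-Maruyama on the Itô form:

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative linear case: sigma(x) = sigma * x, b(x) = b * x.
# The Ito SDE  dX = (b + sigma^2/2) X dt + sigma X dW  then has the
# Doss-Sussmann pathwise solution X_t = u(W_t, Y_t), u(w, y) = y exp(sigma w),
# dY = b Y dt, i.e. X_t = X_0 exp(b t + sigma W_t).
b, sigma, X0 = 0.1, 0.4, 1.0
T, n_steps = 1.0, 100_000
dt = T / n_steps
dW = rng.normal(0.0, np.sqrt(dt), size=n_steps)
W_T = float(np.sum(dW))

# Euler-Maruyama on the Ito form, written as a running product
X_em = X0 * float(np.prod(1.0 + (b + 0.5 * sigma**2) * dt + sigma * dW))

# Pathwise (deterministic function of the driving path) solution at time T
X_pathwise = X0 * float(np.exp(b * T + sigma * W_T))
rel_err = abs(X_em - X_pathwise) / X_pathwise
```

Here the stochastic integral has been traded for an exponential of the driving path plus an ordinary ODE, which is the point of the transform.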

See also Rogers and Williams (1987) section V.28.

Question: does this connect with rough path approaches?

## Rough path integrals

See rough path.

## Paley-Wiener integral

There is a narrower, but lazier, version of the Itô integral. Jonathan Mattingly introduces it in Paley-Wiener-Zygmund Integral. Assume \(f\) is continuous with a continuous first derivative and \(f(1)=0\).

We define the stochastic integral \(\int_{0}^{1} f(t) * \mathrm{d} W(t)\) for these functions by the standard Riemann integral, \[ \int_{0}^{1} f(t) * \mathrm{d} W(t)=-\int_{0}^{1} f^{\prime}(t) W(t) \mathrm{d} t \] Then \[ \mathbf{E}\left[\left(\int_{0}^{1} f(t) * \mathrm{d} W(t)\right)^{2}\right]=\int_{0}^{1} f^{2}(t) \mathrm{d} t. \] Paley, Wiener, and Zygmund then used this isometry to extend the integral to \(f\in L^{2}[0,1]\) as the limit of approximating continuous functions.
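A quick Monte Carlo check of that isometry with the illustrative choice \(f(t)=1-t\), for which \(\int_{0}^{1} f(t) * \mathrm{d}W(t)=\int_{0}^{1} W(t)\,\mathrm{d}t\) and \(\int_{0}^{1} f^{2}(t)\,\mathrm{d}t = 1/3\):

```python
import numpy as np

rng = np.random.default_rng(5)

# f(t) = 1 - t: continuous, C^1, with f(1) = 0 (illustrative choice).
# Then int_0^1 f * dW := -int_0^1 f'(t) W(t) dt = int_0^1 W(t) dt,
# whose variance should equal int_0^1 f^2(t) dt = 1/3.
n_steps, n_paths = 1_000, 10_000
dt = 1.0 / n_steps
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.cumsum(dW, axis=1)

f_prime = -1.0
pwz = -f_prime * np.sum(W * dt, axis=1)  # -int f' W dt, as a Riemann sum
var_emp = float(np.var(pwz))
var_exact = 1.0 / 3.0
```

Note that the integral is computed as an ordinary (pathwise) Riemann sum of \(W\); no stochastic integration is needed, which is exactly the laziness on offer.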

What does this get us in terms of SDEs?

## References

*The Geometry of Random Fields*. SIAM ed. Philadelphia: Society for Industrial and Applied Mathematics.

*Random Fields and Geometry*. Springer Monographs in Mathematics 115. New York: Springer.

*Applications of Random Fields and Geometry Draft*.

*Proceedings of the London Mathematical Society* 101 (3): 697–726.

*Mathematical Methods for Physicists*. 6th ed. Boston: Elsevier.

*Mathematical Methods for Physicists: A Comprehensive Guide*. 7th ed. Amsterdam ; Boston: Elsevier.

*Malaysian Journal of Fundamental and Applied Sciences* 13 (3).

*Probabilistic Analysis and Related Topics*, edited by A. T. Bharucha-reid, 1–79. Academic Press.

*Diffusion Processes and Stochastic Calculus*. EMS Textbooks in Mathematics. Zurich, Switzerland: European Mathematical Society.

*Electron. Comm. Probab.* 6 (95): 106.

*arXiv:0912.3297 [Math]*, December.

*Stochastic Processes and Calculus*. Springer Texts in Business and Economics. Cham: Springer International Publishing.

*Stochastic Partial Differential Equations*. Boston, MA: Birkhäuser Boston.

*Physical Review E* 53 (3): R2021–24.

*Limit Theorems for Stochastic Processes*, edited by Jean Jacod and Albert N. Shiryaev, 1–63. Grundlehren Der Mathematischen Wissenschaften. Berlin, Heidelberg: Springer Berlin Heidelberg.

*Foundations of Modern Probability*. 2nd ed. Probability and Its Applications. New York: Springer-Verlag.

*Annales de l’Institut Henri Poincaré, Probabilités Et Statistiques* 52 (2): 915–38.

*arXiv:0712.4357 [Math]*, December.

*The Annals of Applied Probability* 26 (1).

*Introduction to Stochastic Calculus With Applications*. Imperial College Press.

*Mathematische Nachrichten* 151 (1): 33–50.

*Stochastic Analysis and Applications* 10 (4): 431–41.

*Numerical Solution of Stochastic Differential Equations*, edited by Peter E. Kloeden and Eckhard Platen, 161–226. Applications of Mathematics. Berlin, Heidelberg: Springer.

*Statistics & Probability Letters* 8 (3): 229–34.

*Journal of Mathematical Analysis and Applications* 63 (3): 772–800.

*Journal of the American Statistical Association* 0 (0): 1–18.

*Advances in Applied Probability* 5 (3): 439–68.

*Journal of Mathematical Analysis and Applications* 76 (1): 124–33.

*Elementary Stochastic Calculus with Finance in View*. Advanced Series on Statistical Science & Applied Probability, vol. 6. Singapore ; River Edge, N.J: World Scientific Publ.

*Non-Life Insurance Mathematics: An Introduction With Stochastic Processes*. Kluwer Academic Publishers.

*Bernoulli* 6 (3): 401–34.

*Stochastic Differential Equations: An Introduction With Applications*. Springer.

*arXiv:1504.05309 [Math, q-Fin]*, January.

*Notes on Stochastic Finance*.

*Stochastic Integration and Differential Equations*. Springer.

*Stochastic systems: theory and applications*. River Edge, NJ: World Scientific.

*Continuous Martingales and Brownian Motion*. Springer Science & Business Media.

*Diffusions, Markov Processes, and Martingales*. 2nd ed. Cambridge Mathematical Library. Cambridge, U.K. ; New York: Cambridge University Press.

*Diffusions, Markov Processes and Martingales 2*. Cambridge University Press.

*Stochastic Analysis and Applications* 22 (6): 1553–76.

*The Annals of Probability* 6 (1): 19–41.

*Stochastic Spatial Processes*. Springer.

*Numerical Methods for Stochastic Computations: A Spectral Method Approach*. USA: Princeton University Press.

*Correlation Theory of Stationary and Related Random Functions. Volume II: Supplementary Notes and References*. Springer Series in Statistics. New York, NY: Springer Science & Business Media.
