Stochastic calculus

Itô and friends

Calculus that works, in a certain sense, for random objects, of certain types. Instrumental in stochastic differential equations. This is a popular and well-explored tool and my notes here are not supposed to be tutorials, although they might resemble that sometimes. I simply want to maintain a list of useful definitions because the literature gets messy and ambiguous sometimes.

Simple integrals of noise

Some of the complexity of stochastic integrals derives from the need to preserve causality for processes evolving in time, via various flavours of filtration and so on. For simple stochastic integrals of deterministic functions we can avoid all that. Robert J. Adler, Taylor, and Worsley (2016), for example, construct a standard one:

We define a Gaussian noise \(W\) with spectral measure \(\nu\) as a random process defined on the Borel subsets of \(\mathbb{R}^{N}\) such that, for all \(A, B \in \mathcal{B}^{N}\) with \(\nu(A)\) and \(\nu(B)\) finite, \[ \begin{aligned} &W(A) \sim \mathcal{N}(0, \nu(A)) \\ &A \cap B=\emptyset \Rightarrow W(A \cup B)=W(A)+W(B) \text { a.s. }\\ &A \cap B=\emptyset \Rightarrow W(A) \perp W(B). \end{aligned} \] We can think of \(\nu\) as the measure which allocates signal power to the spectrum.

Having defined Gaussian noise, we then define the integral \[ \int_{\mathbb{R}^{N}} \varphi(t) W(d t) \] for deterministic \(\varphi\) with \(\int \varphi^{2}(x) \nu(d x)<\infty\). We do this with the “standard machinery”, i.e. we start with simple functions \[ \varphi(t)=\sum_{1}^{n} a_{i} \mathbb{1}_{A_{i}}(t) \] where \(A_{1}, \ldots, A_{n} \subset \mathbb{R}^{N}\) are disjoint and the \(a_{i}\) are real, and define \[ W(\varphi) \equiv \int_{\mathbb{R}^{N}} \varphi(t) W(d t)=\sum_{1}^{n} a_{i} W\left(A_{i}\right). \] A sum of independent Gaussian RVs is again Gaussian, so \(W(\varphi)\) has zero mean and variance \(\sum a_{i}^{2} \nu\left(A_{i}\right)\). Now think of \(W\) as a mapping from simple functions to random variables. This mapping is an isometry from the simple functions, under the \(L^{2}(\nu)\) norm, into the square-integrable random variables, so it extends uniquely to all functions square-integrable with respect to \(\nu\) by taking \(L^{2}\) limits of approximating simple functions. Further extension to general Lévy noise or to processes with non-zero mean is not too complicated.
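As a sanity check, here is a minimal numerical sketch of the simple-function construction, assuming NumPy. All concrete choices are illustrative: \(\nu\) is Lebesgue measure on \([0,1]\) and the \(A_i\) partition it into intervals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Disjoint sets A_i: intervals partitioning [0, 1]; nu = Lebesgue measure.
edges = np.linspace(0.0, 1.0, 11)      # 10 intervals
nu = np.diff(edges)                    # nu(A_i) = interval lengths
a = rng.normal(size=nu.size)           # coefficients a_i of the simple function

n_draws = 200_000
# W(A_i) ~ N(0, nu(A_i)), independent across disjoint sets
W_A = rng.normal(0.0, np.sqrt(nu), size=(n_draws, nu.size))
W_phi = W_A @ a                        # W(phi) = sum_i a_i W(A_i)

print(W_phi.mean())                    # ~ 0
print(W_phi.var(), (a**2 * nu).sum())  # empirical vs sum_i a_i^2 nu(A_i)
```

The empirical mean and variance of \(W(\varphi)\) should match the predicted \(0\) and \(\sum a_i^2 \nu(A_i)\) up to Monte Carlo error.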

Defining two simple functions on the same sets, \[ \varphi(t)=\sum_{1}^{n} a_{i} \mathbb{1}_{A_{i}}(t), \quad \psi(t)=\sum_{1}^{n} b_{i} \mathbb{1}_{A_{i}}(t) \] we see that \[ \begin{aligned} \mathbb{E}\{W(\varphi) W(\psi)\} &=\mathbb{E}\left\{\sum_{1}^{n} a_{i} W\left(A_{i}\right) \cdot \sum_{1}^{n} b_{i} W\left(A_{i}\right)\right\} \\ &=\sum_{1}^{n} a_{i} b_{i} \mathbb{E}\left\{\left[W\left(A_{i}\right)\right]^{2}\right\} \\ &=\sum_{1}^{n} a_{i} b_{i} \nu\left(A_{i}\right) \\ &=\int_{\mathbb{R}^{N}} \varphi(t) \psi(t) \nu(d t) \end{aligned} \] Taking the limit, we see that also \[ \mathbb{E}\left\{\int_{\mathbb{R}^{N}} \varphi(t) W(d t) \int_{\mathbb{R}^{N}} \psi(t) W(d t)\right\}=\int_{\mathbb{R}^{N}} \varphi(t) \psi(t) \nu(d t).\]
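The isometry lends itself to the same kind of Monte Carlo check. This sketch (again assuming NumPy, with an illustrative partition of \([0,1]\) and Lebesgue \(\nu\)) compares the empirical cross-moment \(\mathbb{E}\{W(\varphi)W(\psi)\}\) with \(\sum a_i b_i \nu(A_i)\):

```python
import numpy as np

rng = np.random.default_rng(1)

edges = np.linspace(0.0, 1.0, 11)
nu = np.diff(edges)                      # nu(A_i) for a partition of [0, 1]
a = rng.normal(size=nu.size)             # phi = sum_i a_i 1_{A_i}
b = rng.normal(size=nu.size)             # psi = sum_i b_i 1_{A_i}

n_draws = 400_000
W_A = rng.normal(0.0, np.sqrt(nu), size=(n_draws, nu.size))
cov_mc = np.mean((W_A @ a) * (W_A @ b))  # E[W(phi) W(psi)], Monte Carlo
cov_exact = (a * b * nu).sum()           # int phi psi dnu
print(cov_mc, cov_exact)
```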

Grand! But we also want to extend this construction to complex random functions, for which the following definition is natural in the centred case.

\[ \begin{aligned} &\mathbb{E}\{W(A)\}=0\\ &\mathbb{E}\{W(A) \overline{W(A)}\}=\nu(A) \\ &A \cap B=\emptyset \Rightarrow W(A \cup B)=W(A)+W(B) \text { a.s. } \\ &A \cap B=\emptyset \Rightarrow \mathbb{E}\{W(A) \overline{W(B)}\}=0. \end{aligned} \]

If we want to make it Gaussian in particular, we need to specify the distributions precisely. We can think of a complex Gaussian RV as simply a 2-dimensional real Gaussian, and the usual rules apply, although some of them are subtly different in complex form. To fully specify the process we need to decompose the covariance \(\nu\) into the joint distribution of real and imaginary parts,

\[ \begin{aligned} &\begin{bmatrix} \mathcal{R}(W(A))\\ \mathcal{I}(W(A))\\ \end{bmatrix} \sim \mathcal{N}\left(0, \begin{bmatrix} \nu_{\mathcal{R}^2}(A) & \nu_{\mathcal{IR}}(A)\\ \nu_{\mathcal{IR}}(A) & \nu_{\mathcal{I}^2}(A) \end{bmatrix}\right)\\ &\text{ where }\nu_{\mathcal{R}^2}(A) + \nu_{\mathcal{I}^2}(A) = \nu(A)\\ &\text{ and }\nu_{\mathcal{R}^2}(A)\nu_{\mathcal{I}^2}(A) > \nu_{\mathcal{IR}}^2(A) \text{ (positive-definiteness).} \end{aligned} \]

Once again from Robert J. Adler, Taylor, and Worsley (2016), we get the suggestion that we could think of this stochastic integral as an infinitesimal version of the Karhunen-Loève expansion. Suppose the (eigenvalue, eigenfunction) pairs are \(\{(\lambda_{\omega}, e^{i\langle t, \omega\rangle}) : \omega \in \mathbb{R}^{N}\}\) and that \(\lambda_{\omega} \neq 0\) for only countably many \(\omega \in \mathbb{R}^{N}\). Then the stationary, complex version of the Mercer expansion tells us that \[ K(t)=\sum_{\omega} \lambda_{\omega} e^{i\langle t, \omega\rangle} \] while the Karhunen-Loève expansion becomes \[ f(t)=\sum_{\omega} \lambda_{\omega}^{1 / 2} \xi_{\omega} e^{i\langle t, \omega\rangle}. \] Fine so far. But if the basis is not countable it gets weird: \[ K(t)=\int_{\mathbb{R}^{N}} \lambda_{\omega} e^{i\langle t, \omega\rangle} d \omega \] and \[ f(t)=\int_{\mathbb{R}^{N}} \lambda_{\omega}^{1 / 2} \xi_{\omega} e^{i\langle t, \omega\rangle} d \omega. \] The first of these integrals is well defined, but the second is not, because \(\omega\) now parameterises an uncountable basis; how then can the \(\xi_{\omega}\) be independent for each \(\omega\)? They cannot as such, but we can get an analogue by interpreting the spectral representation as a stochastic integral, writing \[ f(t)=\int_{\mathbb{R}^{N}} e^{i\langle t, \omega\rangle} W(d \omega) \] where \(W\) is Gaussian \(\nu\)-noise with spectral measure defined by \(\nu(d \omega)=\lambda_{\omega} d \omega .\) This now evokes classic signal-processing textbook exercises, at least to me.
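To make the countable-spectrum case concrete, here is a small simulation, assuming NumPy, in one dimension with made-up frequencies and eigenvalues. It synthesises \(f(t)=\sum_{\omega} \lambda_{\omega}^{1/2} \xi_{\omega} e^{i t \omega}\) with i.i.d. standard complex Gaussian \(\xi_{\omega}\) and checks that the empirical covariance matches \(K(t)=\sum_{\omega} \lambda_{\omega} e^{i t \omega}\):

```python
import numpy as np

rng = np.random.default_rng(2)

# Countable spectrum: frequencies omega with eigenvalues lambda_omega (illustrative)
omega = np.array([-2.0, -1.0, 0.5, 3.0])
lam = np.array([0.3, 0.5, 0.8, 0.2])     # lambda_omega >= 0, summable

n_draws = 100_000
# xi_omega: i.i.d. standard complex Gaussians, E[xi conj(xi)] = 1
xi = (rng.normal(size=(n_draws, omega.size))
      + 1j * rng.normal(size=(n_draws, omega.size))) / np.sqrt(2)

t = 0.7
f_t = (np.sqrt(lam) * xi * np.exp(1j * t * omega)).sum(axis=1)  # f(t)
f_0 = (np.sqrt(lam) * xi).sum(axis=1)                           # f(0)

K_mc = np.mean(f_t * np.conj(f_0))       # empirical E[f(t) conj(f(0))]
K_exact = (lam * np.exp(1j * t * omega)).sum()
print(K_mc, K_exact)
```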

Itô Integral


Itô’s lemma

Let \(X=\left(X^{1}, \ldots, X^{n}\right)\) be an \(n\)-tuple of semimartingales and let \(f: \mathbb{R}^{n} \rightarrow\mathbb{R}\) have continuous second-order partial derivatives. Then \(f(X)\) is also a semimartingale and the following formula holds:

\[\begin{aligned} f\left(X_{t}\right) - f\left(X_{0}\right) &= \sum_{i=1}^{n} \int_{0+}^{t} \frac{\partial f}{\partial x_{i}}\left(X_{s-}\right) \mathrm{d} X_{s}^{i} \\ &\quad +\frac{1}{2} \sum_{1 \leq i, j \leq n} \int_{0+}^{t} \frac{\partial^{2} f}{\partial x_{i} \partial x_{j}}\left(X_{s-}\right) \mathrm{d}\left[X^{i}, X^{j}\right]_{s}^{c} \\ &\quad +\sum_{0<s \leq t}\left(f\left(X_{s}\right)-f\left(X_{s-}\right)-\sum_{i=1}^{n} \frac{\partial f}{\partial x_{i}}\left(X_{s-}\right) \Delta X_{s}^{i}\right) \end{aligned}\]

Here the bracket term is the quadratic covariation, \[ [X,Y] := XY-\int X_{s-} \mathrm{d}Y(s)-\int Y_{s-} \mathrm{d}X(s), \] and the superscript \(c\) denotes its continuous part.
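For Brownian motion the bracket is famously \([W,W]_{t}=t\). A discretised check of both this and the defining identity above (with \(X=Y=W\), \(W_{0}=0\)), assuming NumPy and an illustrative grid:

```python
import numpy as np

rng = np.random.default_rng(3)

T, n = 1.0, 100_000
dW = rng.normal(0.0, np.sqrt(T / n), size=n)  # Brownian increments on a fine grid
W = np.concatenate([[0.0], np.cumsum(dW)])

qv = np.sum(dW**2)             # discrete quadratic variation, ~ [W, W]_T = T
ito = np.sum(W[:-1] * dW)      # left-point sums for int W_{s-} dW_s
# Defining identity with X = Y = W and W_0 = 0:
#   [W, W]_T = W_T^2 - 2 int_0^T W_{s-} dW_s
print(qv, W[-1]**2 - 2 * ito)  # agree to floating-point precision
print(qv)                      # ~ T = 1
```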

For a continuous semimartingale the jump terms vanish and the left limits \(X_{s-}\) coincide with \(X_{s}\), so the formula simplifies to \[\begin{aligned} f\left(X_{t}\right) - f\left(X_{0}\right) &= \sum_{i=1}^{n} \int_{0+}^{t} \frac{\partial f}{\partial x_{i}}\left(X_{s}\right) \mathrm{d} X_{s}^{i} \\ &\quad +\frac{1}{2} \sum_{1 \leq i, j \leq n} \int_{0+}^{t} \frac{\partial^{2} f}{\partial x_{i} \partial x_{j}}\left(X_{s}\right) \mathrm{d}\left[X^{i}, X^{j}\right]_{s}^{c} \end{aligned}\]
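For a concrete instance of the continuous case, take \(X=W\) a standard Brownian motion and \(f(x)=x^{2}\); since \([W,W]_{t}=t\), the formula reduces to \(W_{T}^{2}=2\int_{0}^{T} W\,\mathrm{d}W + T\). A discretised check, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(4)

T, n = 1.0, 200_000
dW = rng.normal(0.0, np.sqrt(T / n), size=n)
W = np.concatenate([[0.0], np.cumsum(dW)])

ito = np.sum(W[:-1] * dW)      # int_0^T W_s dW_s via left-point (Ito) sums
# Ito's formula for f(x) = x^2:  W_T^2 = 2 int W dW + [W, W]_T, with [W, W]_T = T
print(W[-1]**2, 2 * ito + T)   # should agree up to discretisation error
```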

Some authors assume that in Itô calculus the driving noise is not a general semimartingale but a Brownian motion, which is a continuous driving noise. Others use the term Itô calculus for the more general version. Sometimes a distinction is made between an Itô diffusion, with a Brownian driving term, and a Lévy SDE, whose driving term need not be Brownian. The terminology is generally messy, though, and particularly in tutorials by applied-finance people it can be hard to work out which set of definitions is in use.

Stratonovich Integral

A variant of the Itô integral which evaluates the integrand at the midpoint of each increment, so it can “look forward” infinitesimally in time; the resulting calculus obeys the classical chain rule. It also has an alternative justification in terms of rough paths.
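The difference between the two conventions shows up already for \(\int W\,\mathrm{d}W\): left-point (Itô) sums converge to \(W_{T}^{2}/2 - T/2\), while midpoint (Stratonovich) sums converge to \(W_{T}^{2}/2\), as the classical chain rule would suggest. A sketch assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(5)

T, n = 1.0, 200_000
dW = rng.normal(0.0, np.sqrt(T / n), size=n)
W = np.concatenate([[0.0], np.cumsum(dW)])

ito = np.sum(W[:-1] * dW)                    # left-point (Ito) sums
strat = np.sum(0.5 * (W[:-1] + W[1:]) * dW)  # midpoint (Stratonovich) sums

# Ito:          int W dW   = W_T^2 / 2 - T / 2
# Stratonovich: int W o dW = W_T^2 / 2
print(ito, W[-1]**2 / 2 - T / 2)
print(strat, W[-1]**2 / 2)
```

The Stratonovich sum telescopes exactly to \(W_T^2/2\); the Itô sum differs from it by half the discrete quadratic variation.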

Doss-Sussman transform

Reduces a stochastic integral to a deterministic one, i.e. replaces the Itô integral with a Lebesgue one (Sussmann 1978; Karatzas and Ruf 2016).

Assumptions: \(\sigma \in C^{1,1}(\mathbb{R})\), \(\sigma, \sigma^{\prime} \in L^{\infty}\), \(b \in C^{0,1}\). Then \[ \mathrm{d} X(t)=b(X(t)) \mathrm{d} t+\frac{1}{2} \sigma(X(t)) \sigma^{\prime}(X(t)) \mathrm{d} t+\sigma(X(t)) \mathrm{d} W(t) \] has a unique (strong) solution of the form \(X=u(W, Y)\) for some \(u \in C^{2}(\mathbb{R})\), where \(Y\) solves the random ODE \[ \mathrm{d} Y(t)=f(W(t), Y(t)) \mathrm{d} t \] for some \(f \in C^{0,1}\).
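A degenerate but easily checked instance, assuming NumPy: take \(\sigma\) constant (so \(\sigma^{\prime}=0\) and the correction drift vanishes) and a made-up drift \(b(x)=-x\). Then \(u(w,y)=\sigma w + y\) with \(f(w,y)=b(\sigma w + y)\), and on a shared grid the transformed random ODE reproduces the Euler-Maruyama path exactly:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical choices: constant sigma, drift b(x) = -x
b = lambda x: -x
sigma = 0.5
T, n = 1.0, 10_000
h = T / n
dW = rng.normal(0.0, np.sqrt(h), size=n)
W = np.concatenate([[0.0], np.cumsum(dW)])

# Euler-Maruyama on the SDE dX = b(X) dt + sigma dW directly
X = np.empty(n + 1); X[0] = 1.0
for i in range(n):
    X[i + 1] = X[i] + b(X[i]) * h + sigma * dW[i]

# Doss-Sussmann: X(t) = sigma W(t) + Y(t), with dY = b(sigma W + Y) dt
Y = np.empty(n + 1); Y[0] = 1.0
for i in range(n):
    Y[i + 1] = Y[i] + b(sigma * W[i] + Y[i]) * h
X_doss = sigma * W + Y

print(np.max(np.abs(X - X_doss)))  # the two discretisations coincide
```

With constant \(\sigma\) the two discrete recursions are algebraically identical, so the paths agree to floating-point precision; the general case needs \(u\) solved from \(\partial u/\partial w = \sigma(u)\).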

See also Rogers and Williams (1987) section V.28.

Question: does this connect with rough path approaches?

Rough path integrals

See rough path.

Paley-Wiener integral

There is a narrower, but lazier, version of the Itô integral. Jonathan Mattingly introduces it in Paley-Wiener-Zygmund Integral. Assume \(f\) is continuous with continuous first derivative and \(f(1)=0\).

We define the stochastic integral \(\int_{0}^{1} f(t) * \mathrm{d} W(t)\) for such functions via integration by parts, the right-hand side being an ordinary Riemann integral: \[ \int_{0}^{1} f(t) * \mathrm{d} W(t)=-\int_{0}^{1} f^{\prime}(t) W(t) \mathrm{d} t \] Then \[ \mathbf{E}\left[\left(\int_{0}^{1} f(t) * \mathrm{d} W(t)\right)^{2}\right]=\int_{0}^{1} f^{2}(t) \mathrm{d} t. \] Paley, Wiener, and Zygmund then used this isometry to extend the integral to \(f\in L^{2}[0,1]\) as the limit of approximating continuous functions.
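A quick Monte Carlo check of the isometry, assuming NumPy, with the illustrative choice \(f(t)=1-t\) (so \(f^{\prime}=-1\) and \(f(1)=0\)): then \(\int f * \mathrm{d}W = \int_{0}^{1} W(t)\,\mathrm{d}t\), whose variance should be \(\int_{0}^{1}(1-t)^{2}\,\mathrm{d}t = 1/3\).

```python
import numpy as np

rng = np.random.default_rng(7)

# f(t) = 1 - t: continuous, f' = -1, f(1) = 0 as required
n_draws, n = 50_000, 500
h = 1.0 / n
dW = rng.normal(0.0, np.sqrt(h), size=(n_draws, n))
W = np.cumsum(dW, axis=1)      # W at grid points h, 2h, ..., 1

pwz = W.sum(axis=1) * h        # -int f' W dt = int W dt, as a Riemann sum
print(pwz.var(), 1.0 / 3.0)    # empirical variance vs int f^2 = 1/3
```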

What does this get us in terms of SDEs?


Adler, Robert J. 2010. The Geometry of Random Fields. SIAM ed. Philadelphia: Society for Industrial and Applied Mathematics.
Adler, Robert J., and Jonathan E. Taylor. 2007. Random Fields and Geometry. Springer Monographs in Mathematics 115. New York: Springer.
Adler, Robert J., Jonathan E. Taylor, and Keith J. Worsley. 2016. Applications of Random Fields and Geometry. Draft.
Applebaum, David, and Markus Riedle. 2010. “Cylindrical Lévy Processes in Banach Spaces.” Proceedings of the London Mathematical Society 101 (3): 697–726.
Arfken, George B., and Hans-Jürgen Weber. 2005. Mathematical Methods for Physicists. 6th ed. Boston: Elsevier.
Arfken, George B., Hans-Jürgen Weber, and Frank E. Harris. 2013. Mathematical Methods for Physicists: A Comprehensive Guide. 7th ed. Amsterdam; Boston: Elsevier.
Ariffin, Noor Amalina Nisa, and Norhayati Rosli. 2017. “Stochastic Taylor Expansion of Derivative-Free Method for Stochastic Differential Equations.” Malaysian Journal of Fundamental and Applied Sciences 13 (3).
Arnold, Ludwig, and Wolfgang Kliemann. 1983. “Qualitative Theory of Stochastic Systems.” In Probabilistic Analysis and Related Topics, edited by A. T. Bharucha-Reid, 1–79. Academic Press.
Baudoin, Fabrice. 2014. Diffusion Processes and Stochastic Calculus. EMS Textbooks in Mathematics. Zurich, Switzerland: European Mathematical Society.
Baudoin, Fabrice, and Alice Vatamanelu. n.d. “Stochastic Calculus,” 114.
Bertoin, Jean, Marc Yor, and others. 2001. “On Subordinators, Self-Similar Markov Processes and Some Factorizations of the Exponential Variable.” Electron. Comm. Probab 6 (95): 106.
Coulaud, Benjamin, and Frédéric JP Richard. 2018. “A Consistent Framework for a Statistical Analysis of Surfaces Based on Generalized Stochastic Processes.”
Davis, Mark H. A., Xin Guo, and Guoliang Wu. 2009. “Impulse Control of Multidimensional Jump Diffusions.” arXiv:0912.3297 [math], December.
Goldys, Beniamin, and Szymon Peszat. 2021. “On Linear Stochastic Flows,” May.
Hanson, Floyd B. 2007. “Stochastic Processes and Control for Jump-Diffusions.” SSRN Scholarly Paper ID 1023497. Rochester, NY: Social Science Research Network.
Hassler, Uwe. 2016. Stochastic Processes and Calculus. Springer Texts in Business and Economics. Cham: Springer International Publishing.
Holden, Helge, Bernt Øksendal, Jan Ubøe, and Tusheng Zhang. 1996. Stochastic Partial Differential Equations. Boston, MA: Birkhäuser Boston.
Inchiosa, M. E., and A. R. Bulsara. 1996. “Signal Detection Statistics of Stochastic Resonators.” Physical Review E 53 (3): R2021–24.
Jacod, Jean, and Albert N. Shiryaev. 1987. “The General Theory of Stochastic Processes, Semimartingales and Stochastic Integrals.” In Limit Theorems for Stochastic Processes, edited by Jean Jacod and Albert N. Shiryaev, 1–63. Grundlehren Der Mathematischen Wissenschaften. Berlin, Heidelberg: Springer Berlin Heidelberg.
Kallenberg, Olav. 2002. Foundations of Modern Probability. 2nd ed. Probability and Its Applications. New York: Springer-Verlag.
Karatzas, Ioannis, and Johannes Ruf. 2016. “Pathwise Solvability of Stochastic Integral Equations with Generalized Drift and Non-Smooth Dispersion Functions.” Annales de l’Institut Henri Poincaré, Probabilités Et Statistiques 52 (2): 915–38.
Karczewska, Anna. 2007. “Convolution Type Stochastic Volterra Equations.” arXiv:0712.4357 [math], December.
Kelly, David. 2016. “Rough Path Recursions and Diffusion Approximations.” The Annals of Applied Probability 26 (1).
Kelly, David, and Ian Melbourne. 2014a. “Smooth Approximation of Stochastic Differential Equations,” March.
———. 2014b. “Deterministic Homogenization for Fast-Slow Systems with Chaotic Noise,” September.
Klebaner, Fima C. 1999. Introduction to Stochastic Calculus With Applications. Imperial College Press.
Kloeden, P. E., and E. Platen. 1991. “Stratonovich and Ito Stochastic Taylor Expansions.” Mathematische Nachrichten 151 (1): 33–50.
Kloeden, P. E., E. Platen, and I. W. Wright. 1992. “The Approximation of Multiple Stochastic Integrals.” Stochastic Analysis and Applications 10 (4): 431–41.
Kloeden, Peter E., and Eckhard Platen. 1992. “Stochastic Taylor Expansions.” In Numerical Solution of Stochastic Differential Equations, edited by Peter E. Kloeden and Eckhard Platen, 161–226. Applications of Mathematics. Berlin, Heidelberg: Springer.
Korzeniowski, Andrzej. 1989. “On Diffusions That Cannot Escape from a Convex Set.” Statistics & Probability Letters 8 (3): 229–34.
Kushner, Harold J, and Giovanni DiMasi. 1978. “Approximations for Functionals and Optimal Control Problems on Jump Diffusion Processes.” Journal of Mathematical Analysis and Applications 63 (3): 772–800.
Liu, Xiao, Kyongmin Yeo, and Siyuan Lu. 2020. “Statistical Modeling for Spatio-Temporal Data From Stochastic Convection-Diffusion Processes.” Journal of the American Statistical Association 0 (0): 1–18.
Matheron, G. 1973. “The Intrinsic Random Functions and Their Applications.” Advances in Applied Probability 5 (3): 439–68.
Meidan, R. 1980. “On the Connection Between Ordinary and Generalized Stochastic Processes.” Journal of Mathematical Analysis and Applications 76 (1): 124–33.
Mikosch, Thomas. 1998. Elementary Stochastic Calculus with Finance in View. Advanced Series on Statistical Science & Applied Probability, vol. 6. Singapore ; River Edge, N.J: World Scientific Publ.
———. 2004. Non-Life Insurance Mathematics: An Introduction With Stochastic Processes. Kluwer Academic Publishers.
Mikosch, Thomas, and Rimas Norvaiša. 2000. “Stochastic Integral Equations Without Probability.” Bernoulli 6 (3): 401–34.
Papanicolaou, Andrew. 2019. “Introduction to Stochastic Differential Equations (SDEs) for Finance.” arXiv:1504.05309 [math, q-Fin], January.
Privault, Nicolas. n.d. Notes on Stochastic Finance.
Protter, Philip. 2005. Stochastic Integration and Differential Equations. Springer.
Pugachev, V. S., and I. N. Sinitsyn. 2001. Stochastic Systems: Theory and Applications. River Edge, NJ: World Scientific.
Revuz, Daniel, and Marc Yor. 2004. Continuous Martingales and Brownian Motion. Springer Science & Business Media.
Rogers, L. C. G., and D. Williams. 2000. Diffusions, Markov Processes, and Martingales. 2nd ed. Cambridge Mathematical Library. Cambridge, U.K. ; New York: Cambridge University Press.
Rogers, L. C. G., and David Williams. 1987. Diffusions, Markov Processes and Martingales 2. Cambridge University Press.
Rößler, Andreas. 2004. “Stochastic Taylor Expansions for the Expectation of Functionals of Diffusion Processes.” Stochastic Analysis and Applications 22 (6): 1553–76.
Schoutens, Wim, K U Leuven, and Michael Studer. 2001. “Stochastic Taylor Expansions for Poisson Processes and Applications Towards Risk Management,” February, 24.
Sussmann, Hector J. 1978. “On the Gap Between Deterministic and Stochastic Ordinary Differential Equations.” The Annals of Probability 6 (1): 19–41.
Tautu, Petre. 2014. Stochastic Spatial Processes. Springer.
Xiu, Dongbin. 2010. Numerical Methods for Stochastic Computations: A Spectral Method Approach. USA: Princeton University Press.
Yaglom, A. M. 1987. Correlation Theory of Stationary and Related Random Functions. Volume II: Supplementary Notes and References. Springer Series in Statistics. New York, NY: Springer Science & Business Media.
Øksendal, Bernt. 1985. Stochastic Differential Equations: An Introduction With Applications. Springer.
