Learning the parameters of a dynamical system in continuous time. I am imagining a parametric setting here; learning some non-parametric approximation to the dynamics is a related but distinct problem.

## Recursive estimation

See recursive identification for generic theory of learning under the distribution shift induced by a moving parameter vector.
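As a concrete toy of that idea (a minimal sketch; the function name, defaults, and demo data are all illustrative, not from any library): recursive least squares with an exponential forgetting factor discounts old observations, which is the simplest way to track a slowly drifting parameter vector.

```python
import numpy as np

def rls_forgetting(xs, ys, lam=0.98, delta=100.0):
    """Recursive least squares with exponential forgetting.

    lam < 1 discounts old data so the estimate can follow a moving
    parameter vector; delta sets a large (weak) initial covariance.
    """
    d = xs.shape[1]
    theta = np.zeros(d)
    P = delta * np.eye(d)            # covariance-like matrix; large = weak prior
    for x, y in zip(xs, ys):
        Px = P @ x
        k = Px / (lam + x @ Px)      # Kalman-style gain
        theta = theta + k * (y - x @ theta)
        P = (P - np.outer(k, Px)) / lam
    return theta

# sanity check on static, noiseless data: recovers the ordinary LS solution
rng = np.random.default_rng(0)
xs = rng.standard_normal((200, 2))
theta_hat = rls_forgetting(xs, xs @ np.array([1.0, -2.0]))
```

With stationary noiseless data this reduces to ordinary least squares; the forgetting factor only matters once the true parameter starts to move.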

## Introductory reading

See Rackauckas et al. (2018), some tutorial implementations by the indefatigable Chris Rackauckas, and a whole MIT course. Chris Rackauckas’ lecture notes christen this area “scientific machine learning”.

Learning stochastic partial differential equations where a whole random field evolves in time is something of interest to me; see spatiotemporal nets and spatiotemporal dynamics for more on that theme.

## In PDEs

See differentiable PDE solvers for now.

## General SDEs

## With sparse SDEs

For least-squares system identification see sparse stochastic processes.
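To make “least-squares system identification” concrete for an SDE (a toy sketch, not drawn from the sparse-stochastic-processes framework): for an Ornstein–Uhlenbeck process $dX = -aX\,dt + \sigma\,dW$, regressing the observed increments on $-X\,dt$ gives a least-squares estimate of the drift parameter.

```python
import numpy as np

rng = np.random.default_rng(0)
a_true, sigma, dt, n = 1.5, 0.3, 0.01, 20000

# simulate an Ornstein-Uhlenbeck path with Euler-Maruyama
x = np.empty(n)
x[0] = 1.0
for i in range(n - 1):
    x[i + 1] = x[i] - a_true * x[i] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# least-squares drift estimate: regress increments dX on the regressor -X dt
dx = np.diff(x)
z = -x[:-1] * dt
a_hat = (z @ dx) / (z @ z)
```

The estimator is consistent as the observation window grows, with error driven by the diffusion term, so expect a noisy but unbiased recovery of `a_true`.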

## Neural

## Controlled differential equations

TBD

## Method of adjoints

A trick in differentiation which happens to be useful for differentiating the likelihood (or other loss functions) of time-evolving systems; see e.g. Errico (1997).

For now, see the method of adjoints in the autodiff notebook.
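In outline (the standard continuous-time derivation, not tied to any one reference): for dynamics $\dot{x}(t) = f(x(t), \theta)$ on $[0, T]$ with terminal loss $L(x(T))$, introduce an adjoint state $\lambda(t)$ integrated backwards in time,

$$
\dot{\lambda}(t) = -\left(\frac{\partial f}{\partial x}\right)^{\!\top} \lambda(t),
\qquad
\lambda(T) = \frac{\partial L}{\partial x(T)},
$$

after which the parameter gradient is

$$
\frac{\mathrm{d} L}{\mathrm{d} \theta}
= \int_{0}^{T} \lambda(t)^{\top}\, \frac{\partial f}{\partial \theta}\bigl(x(t), \theta\bigr)\, \mathrm{d} t .
$$

One forward solve plus one backward adjoint solve yields the whole gradient, at a cost that does not scale with the dimension of $\theta$.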

## Tools

### Python

Diffrax is a JAX-based library providing numerical differential equation solvers.

Features include:

- ODE/SDE/CDE (ordinary/stochastic/controlled) solvers
- lots of different solvers (including `Tsit5`, `Dopri8`, symplectic solvers, implicit solvers)
- vmappable everything (including the region of integration)
- using a PyTree as the state
- dense solutions
- multiple adjoint methods for backpropagation
- support for neural differential equations.

From a technical point of view, the internal structure of the library is pretty cool — all kinds of equations (ODEs, SDEs, CDEs) are solved in a unified way (rather than being treated separately), producing a small tightly-written library.

### Julia

Chris Rackauckas is a veritable wizard with this stuff; read his blog.

Here is a tour of fun tricks with stochastic PDEs. There is a lot of tooling for this; DiffEqOperators … does something. DiffEqFlux (EZ neural ODEs that work with Flux) claims to make neural SDEs simple.

+1 for Julia here.

## References

*Stochastic Processes and Their Applications* 12 (3): 313–26.

*arXiv:1702.05390 [Physics, Stat]*, February.

*arXiv:1404.7456 [Cs, Stat]*, April.

*Journal of Nonlinear Science* 29 (4): 1563–1619.

*Proceedings of ICLR*.

*Advances in Neural Information Processing Systems 31*, edited by S. Bengio, H. Wallach, H. Larochelle, K. Grauman, N. Cesa-Bianchi, and R. Garnett, 6572–83. Curran Associates, Inc.

*Advances in Neural Information Processing Systems*. Vol. 33.

*arXiv:2012.07244 [Cs]*, March.

*arXiv:1602.05125 [Math, Stat]*, February.

Errico, Ronald M. 1997. “What Is an Adjoint Model?” *Bulletin of the American Meteorological Society* 78 (11): 2577–92.

*arXiv:1902.10298 [Cs]*, February.

*arXiv:2007.04154 [Cs, q-Fin, Stat]*, July.

*arXiv:1810.01367 [Cs, Stat]*, October.

*Advances in Neural Information Processing Systems*, 34:572–85. Curran Associates, Inc.

*Royal Society Open Science* 9 (2): 211823.

*Advances in Neural Information Processing Systems 32*, edited by H. Wallach, H. Larochelle, A. Beygelzimer, F. d’Alché-Buc, E. Fox, and R. Garnett, 9847–58. Curran Associates, Inc.

*arXiv:2005.08926 [Cs, Stat]*, November.

*International Conference on Artificial Intelligence and Statistics*, 3870–82. PMLR.

*arXiv:2107.10127 [Math, Stat]*, July.

*Annual Reviews in Control* 34 (1): 1–12.

*arXiv:2107.10879 [Physics]*, July.

*arXiv:2107.11253 [Nlin, Physics:physics, Stat]*, July.

*IEEE Transactions on Signal Processing* 55 (2): 493–506.

*arXiv:2002.08071 [Cs, Stat]*.

*Probabilistic Engineering Mechanics* 57 (July): 14–25.

*arXiv:1612.07197 [Math, Stat]*, December.

*arXiv:1612.09158 [Cs, Stat]*, December.

*arXiv:1812.01892 [Cs]*, December.

*arXiv:2001.04385 [Cs, Math, q-Bio, Stat]*, August.

*arXiv:2109.07573 [Physics]*, September.

*Artificial Neural Networks and Machine Learning – ICANN 2011*, edited by Timo Honkela, Włodzisław Duch, Mark Girolami, and Samuel Kaski, 6792:151–58. Lecture Notes in Computer Science. Berlin, Heidelberg: Springer.

*Applied Stochastic Differential Equations*. Institute of Mathematical Statistics Textbooks 10. Cambridge; New York, NY: Cambridge University Press.

*arXiv:2103.10153 [Cs, Stat]*, June.

*Advances in Neural Information Processing Systems*.

*arXiv:2007.00016 [Physics]*, January.

*An Introduction to Sparse Stochastic Processes*. New York: Cambridge University Press.

*Nuclear Engineering and Design* 79 (3): 281–87.

*Neural Networks* 1 (4): 339–56.

*arXiv:1905.10994 [Cs, Stat]*, October.

*Statistical Inference for Stochastic Processes* 25 (1): 43–60.

*SIAM Journal on Scientific Computing* 42 (2): A639–65.
