Method of Adjoints for differentiating through ODEs
September 15, 2017 — May 15, 2023
Tags: Bayes, dynamical systems, linear algebra, probability, signal processing, state space models, statistics, time series
Constructing a backward (P)DE whose solution effectively gives us the gradients of the forward (P)DE. A trick in automatic differentiation which happens to be useful for differentiating likelihoods (or other functionals) of time-evolving systems; a minimal worked sketch follows below. This is an active area of research (Kidger, Chen, and Lyons 2021; Kidger et al. 2020; Li et al. 2020; Rackauckas et al. 2018; Stapor, Fröhlich, and Hasenauer 2018; Cao et al. 2003), but also old and well-studied (Errico 1997).
- Versus autodiff: see *There and Back Again: A Tale of Slopes and Expectations* (Mathematics for Machine Learning).
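To make the construction concrete, here is a minimal sketch of the continuous adjoint method for a scalar ODE $\dot{x} = f(x, \theta)$ with terminal loss $L = \ell(x(T))$: solve forward, then solve the adjoint equation $\dot{\lambda} = -\lambda \, \partial f / \partial x$ backwards in time from $\lambda(T) = \partial \ell / \partial x(T)$, accumulating the gradient $\mathrm{d}L/\mathrm{d}\theta = \int_0^T \lambda \, \partial f / \partial \theta \, \mathrm{d}t$. The SciPy calls are standard; the toy dynamics, loss, and variable names are assumptions for illustration, not taken from any of the cited papers.

```python
from scipy.integrate import solve_ivp

# Toy problem (assumed for illustration): dx/dt = -theta * x,
# terminal loss L = x(T)^2. We want dL/dtheta without
# differentiating through the solver's internals.
theta = 0.7
x0, T = 2.0, 1.5

def f(t, x):
    # Forward dynamics.
    return -theta * x

# 1. Forward solve, keeping a dense interpolant so the backward
#    pass can query x(t) at arbitrary times.
fwd = solve_ivp(f, (0.0, T), [x0], dense_output=True, rtol=1e-10, atol=1e-10)
xT = fwd.y[0, -1]

# 2. Backward (adjoint) solve. The adjoint obeys
#    dlambda/dt = -lambda * df/dx, with lambda(T) = dL/dx(T) = 2 x(T);
#    the gradient accumulates via dg/dt = lambda * df/dtheta.
def adjoint_rhs(t, s):
    lam, _ = s
    x = fwd.sol(t)[0]
    dfdx = -theta      # d/dx of (-theta * x)
    dfdtheta = -x      # d/dtheta of (-theta * x)
    return [-lam * dfdx, lam * dfdtheta]

bwd = solve_ivp(adjoint_rhs, (T, 0.0), [2.0 * xT, 0.0], rtol=1e-10, atol=1e-10)
# Integrating from T down to 0 accumulates the integral with a sign flip.
grad_adjoint = -bwd.y[1, -1]

# 3. Check against the closed form: x(T) = x0 * exp(-theta * T),
#    so dL/dtheta = -2 * T * x(T)^2.
grad_exact = -2.0 * T * xT**2
print(grad_adjoint, grad_exact)  # agree to solver tolerance
```

One design point the sketch makes visible: the backward pass needs $x(t)$ at arbitrary times, here supplied by a dense interpolant from the forward solve. Neural-ODE-style adjoints instead recover $x(t)$ by integrating the state backwards alongside $\lambda$, trading memory for extra solver work; much of the cited literature (e.g. Rackauckas et al. 2018; Kidger, Chen, and Lyons 2021) weighs the accuracy and cost consequences of such choices.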
1 References
Cao, Li, Petzold, et al. 2003. “Adjoint Sensitivity Analysis for Differential-Algebraic Equations: The Adjoint DAE System and Its Numerical Solution.” SIAM Journal on Scientific Computing.
Carpenter, Hoffman, Brubaker, et al. 2015. “The Stan Math Library: Reverse-Mode Automatic Differentiation in C++.” arXiv Preprint arXiv:1509.07164.
Errico. 1997. “What Is an Adjoint Model?” Bulletin of the American Meteorological Society.
Gahungu, Lanyon, Álvarez, et al. 2022. “Adjoint-Aided Inference of Gaussian Process Driven Differential Equations.”
Giles. 2008. “Collected Matrix Derivative Results for Forward and Reverse Mode Algorithmic Differentiation.” In Advances in Automatic Differentiation.
Innes. 2018. “Don’t Unroll Adjoint: Differentiating SSA-Form Programs.” arXiv:1810.07951 [Cs].
Ionescu, Vantzos, and Sminchisescu. 2016. “Training Deep Networks with Structured Layers by Matrix Backpropagation.”
Johnson. 2012. “Notes on Adjoint Methods for 18.335.”
Kavvadias, Papoutsis-Kiachagias, and Giannakoglou. 2015. “On the Proper Treatment of Grid Sensitivities in Continuous Adjoint Methods for Shape Optimization.” Journal of Computational Physics.
Kidger, Chen, and Lyons. 2021. “‘Hey, That’s Not an ODE’: Faster ODE Adjoints via Seminorms.” In Proceedings of the 38th International Conference on Machine Learning.
Kidger, Morrill, Foster, et al. 2020. “Neural Controlled Differential Equations for Irregular Time Series.” arXiv:2005.08926 [Cs, Stat].
Li, Wong, Chen, et al. 2020. “Scalable Gradients for Stochastic Differential Equations.” In International Conference on Artificial Intelligence and Statistics.
Margossian, Vehtari, Simpson, et al. 2020. “Hamiltonian Monte Carlo Using an Adjoint-Differentiated Laplace Approximation: Bayesian Inference for Latent Gaussian Models and Beyond.” arXiv:2004.12550 [Stat].
Mitusch, Funke, and Dokken. 2019. “Dolfin-Adjoint 2018.1: Automated Adjoints for FEniCS and Firedrake.” Journal of Open Source Software.
Papoutsis-Kiachagias, Evangelos. 2013. “Adjoint Methods for Turbulent Flows, Applied to Shape or Topology Optimization and Robust Design.”
Papoutsis-Kiachagias, E. M., and Giannakoglou. 2016. “Continuous Adjoint Methods for Turbulent Flows, Applied to Shape and Topology Optimization: Industrial Applications.” Archives of Computational Methods in Engineering.
Papoutsis-Kiachagias, E. M., Magoulas, Mueller, et al. 2015. “Noise Reduction in Car Aerodynamics Using a Surrogate Objective Function and the Continuous Adjoint Method with Wall Functions.” Computers & Fluids.
Rackauckas, Ma, Dixit, et al. 2018. “A Comparison of Automatic Differentiation and Continuous Sensitivity Analysis for Derivatives of Differential Equation Solutions.” arXiv:1812.01892 [Cs].
Stapor, Fröhlich, and Hasenauer. 2018. “Optimization and Uncertainty Analysis of ODE Models Using 2nd Order Adjoint Sensitivity Analysis.” bioRxiv.