Andersson, Gillis, Horn, et al. 2019.
“CasADi: A Software Framework for Nonlinear Optimization and Optimal Control.” Mathematical Programming Computation.
Arridge, Maass, Öktem, et al. 2019.
“Solving Inverse Problems Using Data-Driven Models.” Acta Numerica.
Babtie, Kirk, and Stumpf. 2014.
“Topological Sensitivity Analysis for Systems Biology.” Proceedings of the National Academy of Sciences.
Chang, Meng, Haber, et al. 2018.
“Multi-Level Residual Networks from Dynamical Systems View.” In
Proceedings of ICLR.
Chen, Boyuan, Huang, Raghupathi, et al. 2022.
“Automated Discovery of Fundamental Variables Hidden in Experimental Data.” Nature Computational Science.
Chen, Ricky T. Q., and Duvenaud. 2019.
“Neural Networks with Cheap Differential Operators.” In
Advances in Neural Information Processing Systems.
Chen, Tian Qi, Rubanova, Bettencourt, et al. 2018.
“Neural Ordinary Differential Equations.” In
Advances in Neural Information Processing Systems 31.
Choromanski, Davis, Likhosherstov, et al. 2020.
“An Ode to an ODE.” In
Advances in Neural Information Processing Systems.
Corenflos, Thornton, Deligiannidis, et al. 2021.
“Differentiable Particle Filtering via Entropy-Regularized Optimal Transport.” arXiv:2102.07850 [Cs, Stat].
Course, Evans, and Nair. 2020.
“Weak Form Generalized Hamiltonian Learning.” In
Advances in Neural Information Processing Systems.
de Brouwer, Simm, Arany, et al. 2019.
“GRU-ODE-Bayes: Continuous Modeling of Sporadically-Observed Time Series.” In
Advances in Neural Information Processing Systems.
Dupont, Doucet, and Teh. 2019.
“Augmented Neural ODEs.” arXiv:1904.01681 [Cs, Stat].
E. 2017.
“A Proposal on Machine Learning via Dynamical Systems.” Communications in Mathematics and Statistics.
———. 2021.
“The Dawning of a New Era in Applied Mathematics.” Notices of the American Mathematical Society.
Finlay, Jacobsen, Nurbekyan, et al. 2020. “How to Train Your Neural ODE: The World of Jacobian and Kinetic Regularization.” In Proceedings of the 37th International Conference on Machine Learning.
Finzi, Wang, and Wilson. 2020.
“Simplifying Hamiltonian and Lagrangian Neural Networks via Explicit Constraints.” In
Advances in Neural Information Processing Systems.
Garnelo, Rosenbaum, Maddison, et al. 2018.
“Conditional Neural Processes.” arXiv:1807.01613 [Cs, Stat].
Garnelo, Schwarz, Rosenbaum, et al. 2018.
“Neural Processes.”
Ghosh, Behl, Dupont, et al. 2020.
“STEER: Simple Temporal Regularization For Neural ODE.” In
Advances in Neural Information Processing Systems.
Gierjatowicz, Sabate-Vidales, Šiška, et al. 2020.
“Robust Pricing and Hedging via Neural SDEs.” arXiv:2007.04154 [Cs, q-Fin, Stat].
Han, Jentzen, and E. 2018.
“Solving High-Dimensional Partial Differential Equations Using Deep Learning.” Proceedings of the National Academy of Sciences.
Hasani, Lechner, Amini, et al. 2020.
“Liquid Time-Constant Networks.” arXiv:2006.04439 [Cs, Stat].
Holzschuh, Vegetti, and Thuerey. 2022. “Score Matching via Differentiable Physics.”
Huh, Yang, Hwang, et al. 2020.
“Time-Reversal Symmetric ODE Network.” In
Advances in Neural Information Processing Systems.
Jia, and Benson. 2019.
“Neural Jump Stochastic Differential Equations.” In
Advances in Neural Information Processing Systems 32.
Kaul. 2020.
“Linear Dynamical Systems as a Core Computational Primitive.” In
Advances in Neural Information Processing Systems.
Kidger, Chen, and Lyons. 2021.
“‘Hey, That’s Not an ODE’: Faster ODE Adjoints via Seminorms.” In
Proceedings of the 38th International Conference on Machine Learning.
Kidger, Foster, Li, et al. 2021.
“Neural SDEs as Infinite-Dimensional GANs.” In
Proceedings of the 38th International Conference on Machine Learning.
Kidger, Morrill, Foster, et al. 2020.
“Neural Controlled Differential Equations for Irregular Time Series.” arXiv:2005.08926 [Cs, Stat].
Kochkov, Sanchez-Gonzalez, Smith, et al. 2020. “Learning Latent Field Dynamics of PDEs.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS).
Kolter, and Manek. 2019.
“Learning Stable Deep Dynamics Models.” In
Advances in Neural Information Processing Systems.
Krishnamurthy, Can, and Schwab. 2022.
“Theory of Gating in Recurrent Neural Networks.” Physical Review X.
Laurent, and von Brecht. 2016.
“A Recurrent Neural Network Without Chaos.” arXiv:1612.06212 [Cs].
Lawrence, Loewen, Forbes, et al. 2020.
“Almost Surely Stable Deep Dynamics.” In
Advances in Neural Information Processing Systems.
Li, Xuechen, Wong, Chen, et al. 2020.
“Scalable Gradients for Stochastic Differential Equations.” In
International Conference on Artificial Intelligence and Statistics.
Lou, Lim, Katsman, et al. 2020.
“Neural Manifold Ordinary Differential Equations.” In
Advances in Neural Information Processing Systems.
Louizos, Shi, Schutte, et al. 2019.
“The Functional Neural Process.” In
Advances in Neural Information Processing Systems.
Massaroli, Poli, Bin, et al. 2020.
“Stable Neural Flows.” arXiv:2003.08063 [Cs, Math, Stat].
Massaroli, Poli, Park, et al. 2020.
“Dissecting Neural ODEs.” arXiv:2002.08071 [Cs, Stat].
Morrill, Kidger, Salvi, et al. 2020. “Neural CDEs for Long Time Series via the Log-ODE Method.”
Nguyen, and Malinsky. 2020. “Exploration and Implementation of Neural Ordinary Differential Equations.”
Niu, Horesh, and Chuang. 2019.
“Recurrent Neural Networks in the Eye of Differential Equations.” arXiv:1904.12933 [Quant-Ph, Stat].
Norcliffe, Bodnar, Day, et al. 2020.
“Neural ODE Processes.”
Palis. 1974.
“Vector Fields Generate Few Diffeomorphisms.” Bulletin of the American Mathematical Society.
Peluchetti, and Favaro. 2019. “Neural SDE: Information Propagation Through the Lens of Diffusion Processes.” In Workshop on Bayesian Deep Learning.
———. 2020.
“Infinitely Deep Neural Networks as Diffusion Processes.” In
International Conference on Artificial Intelligence and Statistics.
Pfau, and Rezende. 2020. “Integrable Nonparametric Flows.”
Poli, Massaroli, Yamashita, et al. 2020.
“Hypersolvers: Toward Fast Continuous-Depth Models.” In
Advances in Neural Information Processing Systems.
Revach, Shlezinger, van Sloun, et al. 2021.
“KalmanNet: Data-Driven Kalman Filtering.” In
ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
Ruthotto, and Haber. 2020.
“Deep Neural Networks Motivated by Partial Differential Equations.” Journal of Mathematical Imaging and Vision.
Saemundsson, Terenin, Hofmann, et al. 2020.
“Variational Integrator Networks for Physically Structured Embeddings.” arXiv:1910.09349 [Cs, Stat].
Sanchez-Gonzalez, Godwin, Pfaff, et al. 2020.
“Learning to Simulate Complex Physics with Graph Networks.” In
Proceedings of the 37th International Conference on Machine Learning.
Shlezinger, Whang, Eldar, et al. 2020.
“Model-Based Deep Learning.” arXiv:2012.08405 [Cs, Eess].
Singh, Yoon, Son, et al. 2019.
“Sequential Neural Processes.” arXiv:1906.10264 [Cs, Stat].
Thuerey, Holl, Mueller, et al. 2021.
Physics-Based Deep Learning.
Tzen, and Raginsky. 2019.
“Theoretical Guarantees for Sampling and Inference in Generative Models with Latent Diffusions.” In
Proceedings of the Thirty-Second Conference on Learning Theory.
Wang, Chuang, Hu, and Lu. 2019.
“A Solvable High-Dimensional Model of GAN.” arXiv:1805.08349 [Cond-Mat, Stat].