Anderson, Brian D. O. 1982. “Reverse-Time Diffusion Equation Models.” Stochastic Processes and Their Applications 12 (3): 313–26.
Batz, Philipp, Andreas Ruttor, and Manfred Opper. 2017. “Approximate Bayes Learning of Stochastic Differential Equations.” arXiv:1702.05390 [Physics, Stat], February.
Baydin, Atilim Gunes, and Barak A. Pearlmutter. 2014. “Automatic Differentiation of Algorithms for Machine Learning.” arXiv:1404.7456 [Cs, Stat], April.
Chang, Bo, Minmin Chen, Eldad Haber, and Ed H. Chi. 2019. “AntisymmetricRNN: A Dynamical System View on Recurrent Neural Networks.” In Proceedings of ICLR.
Chen, Tian Qi, Yulia Rubanova, Jesse Bettencourt, and David K. Duvenaud. 2018. “Neural Ordinary Differential Equations.” In Advances in Neural Information Processing Systems 31, edited by S. Bengio, H. Wallach, H. Larochelle, K. Grauman, N. Cesa-Bianchi, and R. Garnett, 6572–83. Curran Associates, Inc.
Choromanski, Krzysztof, Jared Quincy Davis, Valerii Likhosherstov, Xingyou Song, Jean-Jacques Slotine, Jacob Varley, Honglak Lee, Adrian Weller, and Vikas Sindhwani. 2020. “An Ode to an ODE.” In Advances in Neural Information Processing Systems. Vol. 33.
Dandekar, Raj, Karen Chung, Vaibhav Dixit, Mohamed Tarek, Aslan Garcia-Valadez, Krishna Vishal Vemula, and Chris Rackauckas. 2021. “Bayesian Neural Ordinary Differential Equations.” arXiv:2012.07244 [Cs], March.
Delft, Anne van, and Michael Eichler. 2016. “Locally Stationary Functional Time Series.” arXiv:1602.05125 [Math, Stat], February.
Errico, Ronald M. 1997. “What Is an Adjoint Model?” Bulletin of the American Meteorological Society 78 (11): 2577–92.
Gholami, Amir, Kurt Keutzer, and George Biros. 2019. “ANODE: Unconditionally Accurate Memory-Efficient Gradients for Neural ODEs.” arXiv:1902.10298 [Cs], February.
Gierjatowicz, Patryk, Marc Sabate-Vidales, David Šiška, Lukasz Szpruch, and Žan Žurič. 2020. “Robust Pricing and Hedging via Neural SDEs.” arXiv:2007.04154 [Cs, q-Fin, Stat], July.
Grathwohl, Will, Ricky T. Q. Chen, Jesse Bettencourt, Ilya Sutskever, and David Duvenaud. 2018. “FFJORD: Free-Form Continuous Dynamics for Scalable Reversible Generative Models.” arXiv:1810.01367 [Cs, Stat], October.
Gu, Albert, Isys Johnson, Karan Goel, Khaled Saab, Tri Dao, Atri Rudra, and Christopher Ré. 2021. “Combining Recurrent, Convolutional, and Continuous-Time Models with Linear State Space Layers.” In Advances in Neural Information Processing Systems, 34:572–85. Curran Associates, Inc.
Hirsh, Seth M., David A. Barajas-Solano, and J. Nathan Kutz. 2022. “Sparsifying Priors for Bayesian Uncertainty Quantification in Model Discovery.” Royal Society Open Science 9 (2): 211823.
Jia, Junteng, and Austin R. Benson. 2019. “Neural Jump Stochastic Differential Equations.” In Advances in Neural Information Processing Systems 32, edited by H. Wallach, H. Larochelle, A. Beygelzimer, F. d’Alché-Buc, E. Fox, and R. Garnett, 9847–58. Curran Associates, Inc.
Kelly, Jacob, Jesse Bettencourt, Matthew James Johnson, and David Duvenaud. 2020. “Learning Differential Equations That Are Easy to Solve.”
Kidger, Patrick, James Morrill, James Foster, and Terry Lyons. 2020. “Neural Controlled Differential Equations for Irregular Time Series.” arXiv:2005.08926 [Cs, Stat], November.
Li, Xuechen, Ting-Kam Leonard Wong, Ricky T. Q. Chen, and David Duvenaud. 2020. “Scalable Gradients for Stochastic Differential Equations.” In International Conference on Artificial Intelligence and Statistics, 3870–82. PMLR.
Li, Yuhong, Tianle Cai, Yi Zhang, Deming Chen, and Debadeepta Dey. 2022. “What Makes Convolutional Models Great on Long Sequence Modeling?” arXiv.
Ljung, Lennart. 2010. “Perspectives on System Identification.” Annual Reviews in Control 34 (1): 1–12.
Lu, Peter Y., Joan Ariño, and Marin Soljačić. 2021. “Discovering Sparse Interpretable Dynamics from Partial Observations.” arXiv:2107.10879 [Physics], July.
Massaroli, Stefano, Michael Poli, Jinkyoo Park, Atsushi Yamashita, and Hajime Asama. 2020. “Dissecting Neural ODEs.” arXiv:2002.08071 [Cs, Stat].
Morrill, James, Patrick Kidger, Cristopher Salvi, James Foster, and Terry Lyons. 2020. “Neural CDEs for Long Time Series via the Log-ODE Method.”
Nabian, Mohammad Amin, and Hadi Meidani. 2019. “A Deep Learning Solution Approach for High-Dimensional Random Differential Equations.” Probabilistic Engineering Mechanics 57 (July): 14–25.
Nguyen, Long, and Andy Malinsky. 2020. “Exploration and Implementation of Neural Ordinary Differential Equations.”
Pham, Tung, and Victor Panaretos. 2016. “Methodology and Convergence Rates for Functional Time Series Regression.” arXiv:1612.07197 [Math, Stat], December.
Pillonetto, Gianluigi. 2016. “The Interplay Between System Identification and Machine Learning.” arXiv:1612.09158 [Cs, Stat], December.
Rackauckas, Christopher, Yingbo Ma, Vaibhav Dixit, Xingjian Guo, Mike Innes, Jarrett Revels, Joakim Nyberg, and Vijay Ivaturi. 2018. “A Comparison of Automatic Differentiation and Continuous Sensitivity Analysis for Derivatives of Differential Equation Solutions.” arXiv:1812.01892 [Cs], December.
Rackauckas, Christopher, Yingbo Ma, Julius Martensen, Collin Warner, Kirill Zubov, Rohit Supekar, Dominic Skinner, Ali Ramadhan, and Alan Edelman. 2020. “Universal Differential Equations for Scientific Machine Learning.” arXiv:2001.04385 [Cs, Math, q-Bio, Stat], August.
Ramsundar, Bharath, Dilip Krishnamurthy, and Venkatasubramanian Viswanathan. 2021. “Differentiable Physics: A Position Piece.” arXiv:2109.07573 [Physics], September.
Särkkä, Simo. 2011. “Linear Operators and Stochastic Partial Differential Equations in Gaussian Process Regression.” In Artificial Neural Networks and Machine Learning – ICANN 2011, edited by Timo Honkela, Włodzisław Duch, Mark Girolami, and Samuel Kaski, 6792:151–58. Lecture Notes in Computer Science. Berlin, Heidelberg: Springer.
Särkkä, Simo, and Arno Solin. 2019. Applied Stochastic Differential Equations. Institute of Mathematical Statistics Textbooks 10. Cambridge; New York, NY: Cambridge University Press.
Schmidt, Jonathan, Nicholas Krämer, and Philipp Hennig. 2021. “A Probabilistic State Space Model for Joint Inference from Differential Equations and Data.” arXiv:2103.10153 [Cs, Stat], June.
Song, Yang, Conor Durkan, Iain Murray, and Stefano Ermon. 2021. “Maximum Likelihood Training of Score-Based Diffusion Models.” In Advances in Neural Information Processing Systems.
Song, Yang, Jascha Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, and Ben Poole. 2022. “Score-Based Generative Modeling Through Stochastic Differential Equations.”
Um, Kiwon, Robert Brand, Yun Fei, Philipp Holl, and Nils Thuerey. 2021. “Solver-in-the-Loop: Learning from Differentiable Physics to Interact with Iterative PDE-Solvers.” arXiv:2007.00016 [Physics], January.
Um, Kiwon, and Philipp Holl. 2021. “Differentiable Physics for Improving the Accuracy of Iterative PDE-Solvers with Neural Networks.”
Unser, Michael A., and Pouya Tafti. 2014. An Introduction to Sparse Stochastic Processes. New York: Cambridge University Press.
Wedig, W. 1984. “A Critical Review of Methods in Stochastic Structural Dynamics.” Nuclear Engineering and Design 79 (3): 281–87.
Yıldız, Çağatay, Markus Heinonen, and Harri Lähdesmäki. 2019. “ODE\(^2\)VAE: Deep Generative Second Order ODEs with Bayesian Neural Networks.” arXiv:1905.10994 [Cs, Stat], October.
Yoshida, Nakahiro. 2022. “Quasi-Likelihood Analysis and Its Applications.” Statistical Inference for Stochastic Processes 25 (1): 43–60.
Zhang, Dongkun, Ling Guo, and George Em Karniadakis. 2020. “Learning in Modal Space: Solving Time-Dependent Stochastic PDEs Using Physics-Informed Neural Networks.” SIAM Journal on Scientific Computing 42 (2): A639–65.