Norcliffe, Alexander, Cristian Bodnar, Ben Day, Jacob Moss, and Pietro Liò. 2020.
“Neural ODE Processes.” In NeurIPS Workshop on Machine Learning and the Physical Sciences.
https://ml4physicalsciences.github.io/2020/files/NeurIPS_ML4PS_2020_66.pdf.
Andersson, Joel A. E., Joris Gillis, Greg Horn, James B. Rawlings, and Moritz Diehl. 2019.
“CasADi: A Software Framework for Nonlinear Optimization and Optimal Control.” Mathematical Programming Computation 11 (1): 1–36.
https://doi.org/10.1007/s12532-018-0139-4.
Anil, Cem, James Lucas, and Roger Grosse. 2018.
“Sorting Out Lipschitz Function Approximation,” November.
https://arxiv.org/abs/1811.05381v1.
Arridge, Simon, Peter Maass, Ozan Öktem, and Carola-Bibiane Schönlieb. 2019.
“Solving Inverse Problems Using Data-Driven Models.” Acta Numerica 28 (May): 1–174.
https://doi.org/10.1017/S0962492919000059.
Babtie, Ann C., Paul Kirk, and Michael P. H. Stumpf. 2014.
“Topological Sensitivity Analysis for Systems Biology.” Proceedings of the National Academy of Sciences 111 (52): 18507–12.
https://doi.org/10.1073/pnas.1414026112.
Bachouch, Achref, Côme Huré, Nicolas Langrené, and Huyen Pham. 2020.
“Deep Neural Networks Algorithms for Stochastic Control Problems on Finite Horizon: Numerical Applications.” January 27, 2020.
http://arxiv.org/abs/1812.05916.
Chang, Bo, Minmin Chen, Eldad Haber, and Ed H. Chi. 2019.
“AntisymmetricRNN: A Dynamical System View on Recurrent Neural Networks.” In
Proceedings of ICLR.
http://arxiv.org/abs/1902.09689.
Chang, Bo, Lili Meng, Eldad Haber, Frederick Tung, and David Begert. 2018.
“Multi-Level Residual Networks from Dynamical Systems View.” In
Proceedings of ICLR.
http://arxiv.org/abs/1710.10348.
Chen, Tian Qi, and David K. Duvenaud. n.d. “Neural Networks with Cheap Differential Operators.”
Chen, Tian Qi, Yulia Rubanova, Jesse Bettencourt, and David K. Duvenaud. 2018.
“Neural Ordinary Differential Equations.” In
Advances in Neural Information Processing Systems 31, edited by S. Bengio, H. Wallach, H. Larochelle, K. Grauman, N. Cesa-Bianchi, and R. Garnett, 6572–83.
Curran Associates, Inc. http://papers.nips.cc/paper/7892-neural-ordinary-differential-equations.pdf.
Choromanski, Krzysztof, Jared Quincy Davis, Valerii Likhosherstov, Xingyou Song, Jean-Jacques Slotine, Jacob Varley, Honglak Lee, Adrian Weller, and Vikas Sindhwani. 2020.
“An Ode to an ODE.” In
Advances in Neural Information Processing Systems. Vol. 33.
http://arxiv.org/abs/2006.11421.
Course, Kevin, Trefor Evans, and Prasanth Nair. 2020.
“Weak Form Generalized Hamiltonian Learning.” In
Advances in Neural Information Processing Systems. Vol. 33.
https://proceedings.neurips.cc//paper_files/paper/2020/hash/d93c96e6a23fff65b91b900aaa541998-Abstract.html.
Dupont, Emilien, Arnaud Doucet, and Yee Whye Teh. 2019.
“Augmented Neural ODEs.” April 2, 2019.
http://arxiv.org/abs/1904.01681.
E, Weinan. 2017.
“A Proposal on Machine Learning via Dynamical Systems.” Communications in Mathematics and Statistics 5 (1): 1–11.
https://doi.org/10.1007/s40304-017-0103-z.
E, Weinan, Jiequn Han, and Qianxiao Li. 2018.
“A Mean-Field Optimal Control Formulation of Deep Learning.” July 3, 2018.
http://arxiv.org/abs/1807.01083.
Eguchi, Shoichi, and Yuma Uehara. n.d.
“Schwarz-Type Model Selection for Ergodic Stochastic Differential Equation Models.” Scandinavian Journal of Statistics.
https://doi.org/10.1111/sjos.12474.
Finlay, Chris, Jörn-Henrik Jacobsen, Levon Nurbekyan, and Adam M. Oberman. n.d. “How to Train Your Neural ODE: The World of Jacobian and Kinetic Regularization.” In Proceedings of ICML.
Finzi, Marc, Ke Alexander Wang, and Andrew G. Wilson. 2020.
“Simplifying Hamiltonian and Lagrangian Neural Networks via Explicit Constraints.” In
Advances in Neural Information Processing Systems. Vol. 33.
https://papers.nips.cc/paper/2020/hash/9f655cc8884fda7ad6d8a6fb15cc001e-Abstract.html.
Garnelo, Marta, Dan Rosenbaum, Chris J. Maddison, Tiago Ramalho, David Saxton, Murray Shanahan, Yee Whye Teh, Danilo J. Rezende, and S. M. Ali Eslami. 2018.
“Conditional Neural Processes.” July 4, 2018.
https://arxiv.org/abs/1807.01613v1.
Garnelo, Marta, Jonathan Schwarz, Dan Rosenbaum, Fabio Viola, Danilo J. Rezende, S. M. Ali Eslami, and Yee Whye Teh. 2018.
“Neural Processes,” July.
https://arxiv.org/abs/1807.01622v1.
Gholami, Amir, Kurt Keutzer, and George Biros. 2019.
“ANODE: Unconditionally Accurate Memory-Efficient Gradients for Neural ODEs.” February 26, 2019.
http://arxiv.org/abs/1902.10298.
Ghosh, Arnab, Harkirat Behl, Emilien Dupont, Philip Torr, and Vinay Namboodiri. 2020.
“STEER: Simple Temporal Regularization for Neural ODE.” In
Advances in Neural Information Processing Systems. Vol. 33.
https://proceedings.neurips.cc//paper_files/paper/2020/hash/a9e18cb5dd9d3ab420946fa19ebbbf52-Abstract.html.
Gierjatowicz, Patryk, Marc Sabate-Vidales, David Šiška, Lukasz Szpruch, and Žan Žurič. 2020.
“Robust Pricing and Hedging via Neural SDEs.” July 8, 2020.
http://arxiv.org/abs/2007.04154.
Grathwohl, Will, Ricky T. Q. Chen, Jesse Bettencourt, Ilya Sutskever, and David Duvenaud. 2018.
“FFJORD: Free-Form Continuous Dynamics for Scalable Reversible Generative Models.” October 2, 2018.
http://arxiv.org/abs/1810.01367.
Haber, Eldad, Felix Lucka, and Lars Ruthotto. 2018.
“Never Look Back - A Modified EnKF Method and Its Application to the Training of Neural Networks Without Back Propagation.” May 21, 2018.
http://arxiv.org/abs/1805.08034.
Han, Jiequn, Arnulf Jentzen, and Weinan E. 2018.
“Solving High-Dimensional Partial Differential Equations Using Deep Learning.” Proceedings of the National Academy of Sciences 115 (34): 8505–10.
https://doi.org/10.1073/pnas.1718942115.
Haro, A. 2008.
“Automatic Differentiation Methods in Computational Dynamical Systems: Invariant Manifolds and Normal Forms of Vector Fields at Fixed Points.” IMA Note.
http://www.maia.ub.es/~alex/admcds/admcds.pdf.
He, Junxian, Daniel Spokoyny, Graham Neubig, and Taylor Berg-Kirkpatrick. 2019.
“Lagging Inference Networks and Posterior Collapse in Variational Autoencoders.” In
Proceedings of ICLR.
http://arxiv.org/abs/1901.05534.
Huh, In, Eunho Yang, Sung Ju Hwang, and Jinwoo Shin. 2020.
“Time-Reversal Symmetric ODE Network.” In
Advances in Neural Information Processing Systems. Vol. 33.
https://proceedings.neurips.cc//paper_files/paper/2020/hash/db8419f41d890df802dca330e6284952-Abstract.html.
Huré, Côme, Huyên Pham, Achref Bachouch, and Nicolas Langrené. 2018.
“Deep Neural Networks Algorithms for Stochastic Control Problems on Finite Horizon, Part I: Convergence Analysis.” December 11, 2018.
http://arxiv.org/abs/1812.04300.
Jia, Junteng, and Austin R Benson. 2019.
“Neural Jump Stochastic Differential Equations.” In
Advances in Neural Information Processing Systems 32, edited by H. Wallach, H. Larochelle, A. Beygelzimer, F. d\textquotesingle Alché-Buc, E. Fox, and R. Garnett, 9847–58.
Curran Associates, Inc. http://papers.nips.cc/paper/9177-neural-jump-stochastic-differential-equations.pdf.
Kaul, Shiva. 2020.
“Linear Dynamical Systems as a Core Computational Primitive.” In
Advances in Neural Information Processing Systems. Vol. 33.
https://proceedings.neurips.cc//paper_files/paper/2020/hash/c3581d2150ff68f3b33b22634b8adaea-Abstract.html.
Kelly, Jacob, Jesse Bettencourt, Matthew James Johnson, and David Duvenaud. 2020.
“Learning Differential Equations That Are Easy to Solve.” In Advances in Neural Information Processing Systems. Vol. 33.
https://arxiv.org/abs/2007.04504v2.
Kidger, Patrick, Ricky T. Q. Chen, and Terry Lyons. 2020. “‘Hey, That’s Not an ODE’: Faster ODE Adjoints with 12 Lines of Code.”
Kidger, Patrick, James Foster, Xuechen Li, Harald Oberhauser, and Terry Lyons. 2020. “Neural SDEs Made Easy: SDEs Are Infinite-Dimensional GANs.” In Advances in Neural Information Processing Systems.
Kochkov, Dmitrii, Alvaro Sanchez-Gonzalez, Jamie Smith, Tobias Pfaff, Peter Battaglia, and Michael P. Brenner. 2020. “Learning Latent Field Dynamics of PDEs.”
Kolter, J Zico, and Gaurav Manek. 2019.
“Learning Stable Deep Dynamics Models.” In
Advances in Neural Information Processing Systems.
http://arxiv.org/abs/2001.06116.
Krishnamurthy, Kamesh, Tankut Can, and David J. Schwab. 2020.
“Theory of Gating in Recurrent Neural Networks.”
http://arxiv.org/abs/2007.14823.
Lawrence, Nathan, Philip Loewen, Michael Forbes, Johan Backstrom, and Bhushan Gopaluni. 2020.
“Almost Surely Stable Deep Dynamics.” In
Advances in Neural Information Processing Systems. Vol. 33.
https://proceedings.neurips.cc//paper_files/paper/2020/hash/daecf755df5b1d637033bb29b319c39a-Abstract.html.
Li, Xuechen, Ting-Kam Leonard Wong, Ricky T. Q. Chen, and David Duvenaud. 2020.
“Scalable Gradients for Stochastic Differential Equations.” In
International Conference on Artificial Intelligence and Statistics, 3870–82.
PMLR.
http://proceedings.mlr.press/v108/li20i.html.
Lou, Aaron, Derek Lim, Isay Katsman, Leo Huang, Qingxuan Jiang, Ser Nam Lim, and Christopher M. De Sa. 2020.
“Neural Manifold Ordinary Differential Equations.” In
Advances in Neural Information Processing Systems. Vol. 33.
https://papers.nips.cc/paper/2020/hash/cbf8710b43df3f2c1553e649403426df-Abstract.html.
Louizos, Christos, Xiahan Shi, Klamer Schutte, and Max Welling. 2019.
“The Functional Neural Process.” June 19, 2019.
http://arxiv.org/abs/1906.08324.
Lu, Lu, Pengzhan Jin, and George Em Karniadakis. 2020.
“DeepONet: Learning Nonlinear Operators for Identifying Differential Equations Based on the Universal Approximation Theorem of Operators.” April 14, 2020.
http://arxiv.org/abs/1910.03193.
Lu, Yulong, and Jianfeng Lu. 2020.
“A Universal Approximation Theorem of Deep Neural Networks for Expressing Probability Distributions.” In
Advances in Neural Information Processing Systems. Vol. 33.
https://papers.nips.cc/paper/2020/hash/2000f6325dfc4fc3201fc45ed01c7a5d-Abstract.html.
Massaroli, Stefano, Michael Poli, Michelangelo Bin, Jinkyoo Park, Atsushi Yamashita, and Hajime Asama. 2020.
“Stable Neural Flows.” March 18, 2020.
http://arxiv.org/abs/2003.08063.
Massaroli, Stefano, Michael Poli, Jinkyoo Park, Atsushi Yamashita, and Hajime Asama. 2020.
“Dissecting Neural ODEs.” In Advances in Neural Information Processing Systems. Vol. 33.
http://arxiv.org/abs/2002.08071.
Mhammedi, Zakaria, Andrew Hellicar, Ashfaqur Rahman, and James Bailey. 2017.
“Efficient Orthogonal Parametrisation of Recurrent Neural Networks Using Householder Reflections.” In
PMLR, 2401–9.
http://proceedings.mlr.press/v70/mhammedi17a.html.
Morrill, James, Patrick Kidger, Cristopher Salvi, James Foster, and Terry Lyons. 2020. “Neural CDEs for Long Time Series via the Log-ODE Method.”
Niu, Murphy Yuezhen, Lior Horesh, and Isaac Chuang. 2019.
“Recurrent Neural Networks in the Eye of Differential Equations.” April 29, 2019.
http://arxiv.org/abs/1904.12933.
Palis, J. 1974.
“Vector Fields Generate Few Diffeomorphisms.” Bulletin of the American Mathematical Society 80 (3): 503–5.
https://projecteuclid.org/euclid.bams/1183535529.
Peluchetti, Stefano, and Stefano Favaro. 2019. “Neural SDE - Information Propagation Through the Lens of Diffusion Processes.” In Workshop on Bayesian Deep Learning.
———. 2020.
“Infinitely Deep Neural Networks as Diffusion Processes.” In
International Conference on Artificial Intelligence and Statistics, 1126–36.
PMLR.
http://proceedings.mlr.press/v108/peluchetti20a.html.
Pfau, David, and Danilo Rezende. 2020. “Integrable Nonparametric Flows.”
Poli, Michael, Stefano Massaroli, Atsushi Yamashita, Hajime Asama, and Jinkyoo Park. 2020a.
“Hypersolvers: Toward Fast Continuous-Depth Models.” In
Advances in Neural Information Processing Systems. Vol. 33.
https://proceedings.neurips.cc//paper/2020/hash/f1686b4badcf28d33ed632036c7ab0b8-Abstract.html.
———. 2020b.
“TorchDyn: A Neural Differential Equations Library.” September 19, 2020.
http://arxiv.org/abs/2009.09346.
Rackauckas, Christopher. 2019.
“The Essential Tools of Scientific Machine Learning (Scientific ML).” The Winnower, August.
https://doi.org/10.15200/winn.156631.13064.
Rackauckas, Christopher, Yingbo Ma, Vaibhav Dixit, Xingjian Guo, Mike Innes, Jarrett Revels, Joakim Nyberg, and Vijay Ivaturi. 2018.
“A Comparison of Automatic Differentiation and Continuous Sensitivity Analysis for Derivatives of Differential Equation Solutions.” December 5, 2018.
http://arxiv.org/abs/1812.01892.
Rackauckas, Christopher, Yingbo Ma, Julius Martensen, Collin Warner, Kirill Zubov, Rohit Supekar, Dominic Skinner, and Ali Ramadhan. 2020.
“Universal Differential Equations for Scientific Machine Learning.” January 13, 2020.
https://arxiv.org/abs/2001.04385v1.
Roeder, Geoffrey, Paul K. Grant, Andrew Phillips, Neil Dalchau, and Edward Meeds. 2019.
“Efficient Amortised Bayesian Inference for Hierarchical and Nonlinear Dynamical Systems.” May 28, 2019.
http://arxiv.org/abs/1905.12090.
Ruthotto, Lars, and Eldad Haber. 2018.
“Deep Neural Networks Motivated by Partial Differential Equations.” April 11, 2018.
http://arxiv.org/abs/1804.04272.
Saemundsson, Steindor, Alexander Terenin, Katja Hofmann, and Marc Peter Deisenroth. 2020.
“Variational Integrator Networks for Physically Structured Embeddings.” March 2, 2020.
http://arxiv.org/abs/1910.09349.
Sanchez-Gonzalez, Alvaro, Jonathan Godwin, Tobias Pfaff, Rex Ying, Jure Leskovec, and Peter W. Battaglia. 2020.
“Learning to Simulate Complex Physics with Graph Networks.” In Proceedings of ICML.
http://arxiv.org/abs/2002.09405.
Singh, Gautam, Jaesik Yoon, Youngsung Son, and Sungjin Ahn. 2019.
“Sequential Neural Processes.” June 24, 2019.
http://arxiv.org/abs/1906.10264.
Şimşekli, Umut, Ozan Sener, George Deligiannidis, and Murat A. Erdogdu. 2020.
“Hausdorff Dimension, Stochastic Differential Equations, and Generalization in Neural Networks.” June 16, 2020.
http://arxiv.org/abs/2006.09313.
Tzen, Belinda, and Maxim Raginsky. 2019.
“Neural Stochastic Differential Equations: Deep Latent Gaussian Models in the Diffusion Limit.” October 27, 2019.
http://arxiv.org/abs/1905.09883.
Vorontsov, Eugene, Chiheb Trabelsi, Samuel Kadoury, and Chris Pal. 2017.
“On Orthogonality and Learning Recurrent Networks with Long Term Dependencies.” In
PMLR, 3570–78.
http://proceedings.mlr.press/v70/vorontsov17a.html.
Wang, Chuang, Hong Hu, and Yue M. Lu. 2019.
“A Solvable High-Dimensional Model of GAN.” October 28, 2019.
http://arxiv.org/abs/1805.08349.
Wang, Sifan, Xinling Yu, and Paris Perdikaris. 2020.
“When and Why PINNs Fail to Train: A Neural Tangent Kernel Perspective,” July.
https://arxiv.org/abs/2007.14527v1.
Yang, Liu, Dongkun Zhang, and George Em Karniadakis. 2020.
“Physics-Informed Generative Adversarial Networks for Stochastic Differential Equations.” SIAM Journal on Scientific Computing 42 (1): A292–317.
https://doi.org/10.1137/18M1225409.
Yıldız, Çağatay, Markus Heinonen, and Harri Lähdesmäki. 2019.
“ODE²VAE: Deep Generative Second Order ODEs with Bayesian Neural Networks.” October 24, 2019.
http://arxiv.org/abs/1905.10994.
Zammit-Mangion, Andrew, and Christopher K. Wikle. 2020.
“Deep Integro-Difference Equation Models for Spatio-Temporal Forecasting.” Spatial Statistics 37 (June): 100408.
https://doi.org/10.1016/j.spasta.2020.100408.
Zhang, Han, Xi Gao, Jacob Unterman, and Tom Arodz. 2020.
“Approximation Capabilities of Neural ODEs and Invertible Residual Networks.” February 29, 2020.
http://arxiv.org/abs/1907.12998.