Garbled highlights from NeurIPS 2020



ML physical sciences

References

Bai, Shaojie, Vladlen Koltun, and J. Zico Kolter. 2020. “Multiscale Deep Equilibrium Models.” In Advances in Neural Information Processing Systems. Vol. 33.
Bolte, Jérôme, and Edouard Pauwels. 2020. “A Mathematical Model for Automatic Differentiation in Machine Learning.” In Advances in Neural Information Processing Systems. Vol. 33.
Chi, Lu, Borui Jiang, and Yadong Mu. 2020. “Fast Fourier Convolution.” In Advances in Neural Information Processing Systems. Vol. 33.
Choromanski, Krzysztof, Jared Quincy Davis, Valerii Likhosherstov, Xingyou Song, Jean-Jacques Slotine, Jacob Varley, Honglak Lee, Adrian Weller, and Vikas Sindhwani. 2020. “An Ode to an ODE.” In Advances in Neural Information Processing Systems. Vol. 33.
Course, Kevin, Trefor Evans, and Prasanth Nair. 2020. “Weak Form Generalized Hamiltonian Learning.” In Advances in Neural Information Processing Systems. Vol. 33.
Dorado-Rojas, Sergio A, Bhanukiran Vinzamuri, and Luigi Vanfretti. 2020. “Orthogonal Laguerre Recurrent Neural Networks.”
Fan, Zhou, and Zhichao Wang. 2020. “Spectra of the Conjugate Kernel and Neural Tangent Kernel for Linear-Width Neural Networks.” In Advances in Neural Information Processing Systems. Vol. 33.
Finzi, Marc, Ke Alexander Wang, and Andrew G. Wilson. 2020. “Simplifying Hamiltonian and Lagrangian Neural Networks via Explicit Constraints.” In Advances in Neural Information Processing Systems. Vol. 33.
Fort, Stanislav, Gintare Karolina Dziugaite, Mansheej Paul, Sepideh Kharaghani, Daniel M. Roy, and Surya Ganguli. 2020. “Deep Learning Versus Kernel Learning: An Empirical Study of Loss Landscape Geometry and the Time Evolution of the Neural Tangent Kernel.” In Advances in Neural Information Processing Systems. Vol. 33.
Gardner, William A. 1988. Statistical Spectral Analysis: A Non-Probabilistic Theory. 1st edition. Englewood Cliffs, N.J: Prentice Hall.
Geifman, Amnon, Abhay Yadav, Yoni Kasten, Meirav Galun, David Jacobs, and Ronen Basri. 2020. “On the Similarity Between the Laplace and Neural Tangent Kernels.” arXiv:2007.01580 [cs, stat].
Gelder, Maxwell Van, Mitchell Wortsman, and Kiana Ehsani. n.d. “Deconstructing the Structure of Sparse Neural Networks.”
Ghosh, Arnab, Harkirat Behl, Emilien Dupont, Philip Torr, and Vinay Namboodiri. 2020. “STEER: Simple Temporal Regularization for Neural ODE.” In Advances in Neural Information Processing Systems. Vol. 33.
He, Bobby, Balaji Lakshminarayanan, and Yee Whye Teh. 2020. “Bayesian Deep Ensembles via the Neural Tangent Kernel.” In Advances in Neural Information Processing Systems. Vol. 33.
Hortúa, Héctor Javier, Riccardo Volpi, Dimitri Marinelli, and Luigi Malagò. 2020. “Accelerating MCMC Algorithms Through Bayesian Deep Networks.”
Huh, In, Eunho Yang, Sung Ju Hwang, and Jinwoo Shin. 2020. “Time-Reversal Symmetric ODE Network.” In Advances in Neural Information Processing Systems. Vol. 33.
Karimi, Amir-Hossein, Gilles Barthe, Bernhard Schölkopf, and Isabel Valera. 2021. “A Survey of Algorithmic Recourse: Definitions, Formulations, Solutions, and Prospects.” arXiv.
Kaul, Shiva. 2020. “Linear Dynamical Systems as a Core Computational Primitive.” In Advances in Neural Information Processing Systems. Vol. 33.
Kelly, Jacob, Jesse Bettencourt, Matthew James Johnson, and David Duvenaud. 2020. “Learning Differential Equations That Are Easy to Solve.” In Advances in Neural Information Processing Systems. Vol. 33.
Kidger, Patrick, Ricky T. Q. Chen, and Terry J. Lyons. 2021. “‘Hey, That’s Not an ODE’: Faster ODE Adjoints via Seminorms.” In Proceedings of the 38th International Conference on Machine Learning, 5443–52. PMLR.
Kochkov, Dmitrii, Alvaro Sanchez-Gonzalez, Jamie Smith, Tobias Pfaff, Peter Battaglia, and Michael P Brenner. 2020. “Learning Latent Field Dynamics of PDEs.” In Machine Learning and the Physical Sciences Workshop at the 34th Conference on Neural Information Processing Systems (NeurIPS).
Kothari, Konik, Maarten de Hoop, and Ivan Dokmanić. 2020. “Learning the Geometry of Wave-Based Imaging.” In Advances in Neural Information Processing Systems. Vol. 33.
Krämer, Andreas, Jonas Köhler, and Frank Noé. n.d. “Preserving Properties of Neural Networks by Perturbative Updates.”
Krishnamurthy, Kamesh, Tankut Can, and David J. Schwab. 2020. “Theory of Gating in Recurrent Neural Networks.” arXiv:2007.14823 [cond-mat, physics:nlin, q-bio].
Lawrence, Nathan, Philip Loewen, Michael Forbes, Johan Backstrom, and Bhushan Gopaluni. 2020. “Almost Surely Stable Deep Dynamics.” In Advances in Neural Information Processing Systems. Vol. 33.
Liu, Zhaoqiang, and Jonathan Scarlett. 2020. “The Generalized Lasso with Nonlinear Observations and Generative Priors.” In Advances in Neural Information Processing Systems. Vol. 33.
Lou, Aaron, Derek Lim, Isay Katsman, Leo Huang, Qingxuan Jiang, Ser Nam Lim, and Christopher M. De Sa. 2020. “Neural Manifold Ordinary Differential Equations.” In Advances in Neural Information Processing Systems. Vol. 33.
Lu, You, and Bert Huang. 2020. “Woodbury Transformations for Deep Generative Flows.” In Advances in Neural Information Processing Systems. Vol. 33.
Lu, Yulong, and Jianfeng Lu. 2020. “A Universal Approximation Theorem of Deep Neural Networks for Expressing Probability Distributions.” In Advances in Neural Information Processing Systems. Vol. 33.
Massaroli, Stefano, Michael Poli, Jinkyoo Park, Atsushi Yamashita, and Hajime Asama. 2020. “Dissecting Neural ODEs.” arXiv:2002.08071 [cs, stat].
Meronen, Lassi, Christabella Irwanto, and Arno Solin. 2020. “Stationary Activations for Uncertainty Calibration in Deep Learning.” In Advances in Neural Information Processing Systems. Vol. 33.
Mhammedi, Zakaria, Dylan J. Foster, Max Simchowitz, Dipendra Misra, Wen Sun, Akshay Krishnamurthy, Alexander Rakhlin, and John Langford. 2020. “Learning the Linear Quadratic Regulator from Nonlinear Observations.” In Advances in Neural Information Processing Systems. Vol. 33.
Miller, Benjamin Kurt, Alex Cole, and Gilles Louppe. n.d. “Simulation-Efficient Marginal Posterior Estimation with Swyft: Stop Wasting Your Precious Time.”
Morrill, James, Patrick Kidger, Cristopher Salvi, James Foster, and Terry Lyons. 2020. “Neural CDEs for Long Time Series via the Log-ODE Method.”
Norcliffe, Alexander, Cristian Bodnar, Ben Day, Jacob Moss, and Pietro Liò. 2020. “Neural ODE Processes.”
Pfau, David, and Danilo Rezende. 2020. “Integrable Nonparametric Flows.”
Poli, Michael, Stefano Massaroli, Atsushi Yamashita, Hajime Asama, and Jinkyoo Park. 2020. “Hypersolvers: Toward Fast Continuous-Depth Models.” In Advances in Neural Information Processing Systems. Vol. 33.
Priestley, M. B. 2004. Spectral Analysis and Time Series. Repr. Probability and Mathematical Statistics. London: Elsevier.
Qin, Chongli, Yan Wu, Jost Tobias Springenberg, Andy Brock, Jeff Donahue, Timothy Lillicrap, and Pushmeet Kohli. 2020. “Training Generative Adversarial Networks by Solving Ordinary Differential Equations.” In Advances in Neural Information Processing Systems. Vol. 33.
Rashidinejad, Paria, Jiantao Jiao, and Stuart Russell. 2020. “SLIP: Learning to Predict in Unknown Dynamical Systems with Long-Term Memory.” In Advances in Neural Information Processing Systems. Vol. 33.
Rojas-Gómez, Renán, Jihyun Yang, Youzuo Lin, James Theiler, and Brendt Wohlberg. 2020. “Physics-Consistent Data-Driven Seismic Inversion with Adaptive Data Augmentation.”
Saha, Akash, and Palaniappan Balamurugan. 2020. “Learning with Operator-Valued Kernels in Reproducing Kernel Krein Spaces.” In Advances in Neural Information Processing Systems. Vol. 33.
Salim, Adil, Anna Korba, and Giulia Luise. 2020. “The Wasserstein Proximal Gradient Algorithm.” In Advances in Neural Information Processing Systems. Vol. 33.
Shen, Zebang, Zhenfu Wang, Alejandro Ribeiro, and Hamed Hassani. 2020. “Sinkhorn Natural Gradient for Generative Models.” In Advances in Neural Information Processing Systems. Vol. 33.
Shukla, Satya Narayan, and Benjamin M Marlin. n.d. “A Survey on Principles, Models and Methods for Learning from Irregularly Sampled Time Series: From Discretization to Attention and Invariance.”
Um, Kiwon, and Philipp Holl. 2021. “Differentiable Physics for Improving the Accuracy of Iterative PDE-Solvers with Neural Networks.”
Vahdat, Arash, and Jan Kautz. 2020. “NVAE: A Deep Hierarchical Variational Autoencoder.” In Advances in Neural Information Processing Systems. Vol. 33.
Verma, Sahil, John Dickerson, and Keegan Hines. 2020. “Counterfactual Explanations for Machine Learning: A Review.”
Walder, Christian, and Richard Nock. 2020. “All Your Loss Are Belong to Bayes.” In Advances in Neural Information Processing Systems. Vol. 33.
Wang, Chulin, Eloisa Bentivegna, Wang Zhou, Levente J Klein, and Bruce Elmegreen. 2020. “Physics-Informed Neural Network Super Resolution for Advection-Diffusion Models.”
Xu, Kailai, and Eric Darve. 2020. “ADCME: Learning Spatially-Varying Physical Fields Using Deep Neural Networks.” arXiv:2011.11955 [cs, math].
Zhang, Kun, Mingming Gong, Petar Stojanov, Biwei Huang, Qingsong Liu, and Clark Glymour. 2020. “Domain Adaptation as a Problem of Inference on Graphical Models.” In Advances in Neural Information Processing Systems. Vol. 33.
Zhang, Rui, Christian Walder, Edwin V. Bonilla, Marian-Andrei Rizoiu, and Lexing Xie. 2020. “Quantile Propagation for Wasserstein-Approximate Gaussian Processes.” In Advances in Neural Information Processing Systems. Vol. 33.
