Garbled highlights from NeurIPS 2020
September 17, 2020 — December 11, 2020
neural nets
statistics
1 Workshops
- Machine Learning for Creativity and Design
- Workshop on Deep Learning and Inverse Problems
- Differentiable vision, graphics, and physics applied to machine learning
- Learning Meaningful Representations of Life
- Tackling Climate Change with Machine Learning
- AI for Earth Sciences
- Causal Discovery & Causality-Inspired Machine Learning
- Interpretable Inductive Biases and Physically Structured Learning
2 Interesting papers by ad hoc theme
2.1 Causality
- Causal Imitation Learning With Unobserved Confounders
- Causal Learning
- Domain Adaptation as a Problem of Inference on Graphical Models
- Sense and Sensitivity Analysis: Simple Post-Hoc Analysis of Bias Due to Unobserved Confounding
- Generalized Independent Noise Condition for Estimating Latent Variable Causal Graphs
- Differentiable Causal Discovery from Interventional Data
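A recurring thread in the causality papers above is the gap between conditioning and intervening in the presence of unobserved confounding. A minimal sketch of that gap, using an illustrative linear SCM (the graph, coefficients, and variable names here are invented for the example, not taken from any of the papers):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Illustrative linear SCM with a confounder Z -> X, Z -> Y, and X -> Y:
#   Z ~ N(0, 1);  X = Z + noise;  Y = 2*X + 3*Z + noise
z = rng.normal(size=n)
x = z + rng.normal(size=n)
y = 2 * x + 3 * z + rng.normal(size=n)

# Observational regression of Y on X is biased by the confounder Z:
slope_obs = np.cov(x, y)[0, 1] / np.var(x)

# Simulating the intervention do(X = x0) cuts the Z -> X edge, so the
# average causal effect recovers the structural coefficient 2.
x0, x1 = 0.0, 1.0
y0 = 2 * x0 + 3 * z + rng.normal(size=n)
y1 = 2 * x1 + 3 * z + rng.normal(size=n)
slope_do = y1.mean() - y0.mean()

print(round(slope_obs, 1))  # ~3.5, inflated by confounding
print(round(slope_do, 1))   # ~2.0, the true causal effect
```

The gap between the two slopes (3.5 vs 2.0) is exactly the bias that sensitivity-analysis and deconfounding methods in the list above try to bound or remove.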
2.2 Learning in continuous time/depth
- Almost Surely Stable Deep Dynamics
- Learning Differential Equations that are Easy to Solve
- Dissecting Neural ODEs
- STEER: Simple Temporal Regularization for Neural ODE
- Training Generative Adversarial Networks by Solving Ordinary Differential Equations
- Ode to an ODE
- Time-Reversal Symmetric ODE Network
- Hypersolvers: Toward Fast Continuous-Depth Models
- On Second Order Behaviour in Augmented Neural ODEs
- Neural Controlled Differential Equations for Irregular Time Series
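The continuous-depth papers above share a common core: treat the network as a vector field dh/dt = f(h, t) and obtain the output by numerically integrating it. A minimal sketch with a fixed-step Euler solver (the network, shapes, and function names are illustrative, not from any particular paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny MLP playing the role of the vector field f(h, t) of a neural ODE.
W1 = rng.normal(scale=0.5, size=(2, 16))
W2 = rng.normal(scale=0.5, size=(16, 2))

def field(h, t):
    """dh/dt = f(h, t): a small tanh network (t ignored for simplicity)."""
    return np.tanh(h @ W1) @ W2

def odeint_euler(f, h0, t0=0.0, t1=1.0, steps=100):
    """Fixed-step Euler integration of dh/dt = f(h, t) from t0 to t1.
    A continuous-depth model replaces a stack of residual blocks with
    this loop; solver-focused papers in the list above are largely about
    making this step cheaper, stabler, or better regularized."""
    h, dt = h0, (t1 - t0) / steps
    for k in range(steps):
        h = h + dt * f(h, t0 + k * dt)
    return h

h0 = np.array([1.0, -1.0])
h1 = odeint_euler(field, h0)
print(h1.shape)  # (2,)
```

Swapping the Euler update for an adaptive solver, adding augmentation dimensions, or regularizing the integration time recovers the rough territory of several of the papers listed.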
3 Learning with weird losses
4 ML physical sciences
5 References
Bai, Koltun, and Kolter. 2020. “Multiscale Deep Equilibrium Models.” In Advances in Neural Information Processing Systems.
Bolte, and Pauwels. 2020. “A Mathematical Model for Automatic Differentiation in Machine Learning.” In Advances in Neural Information Processing Systems.
Chi, Jiang, and Mu. 2020. “Fast Fourier Convolution.” In Advances in Neural Information Processing Systems.
Choromanski, Davis, Likhosherstov, et al. 2020. “An Ode to an ODE.” In Advances in Neural Information Processing Systems.
Course, Evans, and Nair. 2020. “Weak Form Generalized Hamiltonian Learning.” In Advances in Neural Information Processing Systems.
Dorado-Rojas, Vinzamuri, and Vanfretti. 2020. “Orthogonal Laguerre Recurrent Neural Networks.”
Fan, and Wang. 2020. “Spectra of the Conjugate Kernel and Neural Tangent Kernel for Linear-Width Neural Networks.” In Advances in Neural Information Processing Systems.
Finzi, Wang, and Wilson. 2020. “Simplifying Hamiltonian and Lagrangian Neural Networks via Explicit Constraints.” In Advances in Neural Information Processing Systems.
Fort, Dziugaite, Paul, et al. 2020. “Deep Learning Versus Kernel Learning: An Empirical Study of Loss Landscape Geometry and the Time Evolution of the Neural Tangent Kernel.” In Advances in Neural Information Processing Systems.
Gardner. 1988. Statistical Spectral Analysis: A Non-Probabilistic Theory.
Geifman, Yadav, Kasten, et al. 2020. “On the Similarity Between the Laplace and Neural Tangent Kernels.” arXiv:2007.01580 [cs, stat].
Ghosh, Behl, Dupont, et al. 2020. “STEER: Simple Temporal Regularization for Neural ODE.” In Advances in Neural Information Processing Systems.
He, Lakshminarayanan, and Teh. 2020. “Bayesian Deep Ensembles via the Neural Tangent Kernel.” In Advances in Neural Information Processing Systems.
Hortúa, Volpi, Marinelli, et al. 2020. “Accelerating MCMC Algorithms Through Bayesian Deep Networks.”
Huh, Yang, Hwang, et al. 2020. “Time-Reversal Symmetric ODE Network.” In Advances in Neural Information Processing Systems.
Karimi, Barthe, Schölkopf, et al. 2021. “A Survey of Algorithmic Recourse: Definitions, Formulations, Solutions, and Prospects.”
Kaul. 2020. “Linear Dynamical Systems as a Core Computational Primitive.” In Advances in Neural Information Processing Systems.
Kelly, Bettencourt, Johnson, et al. 2020. “Learning Differential Equations That Are Easy to Solve.”
Kidger, Chen, and Lyons. 2021. “‘Hey, That’s Not an ODE’: Faster ODE Adjoints via Seminorms.” In Proceedings of the 38th International Conference on Machine Learning.
Kochkov, Sanchez-Gonzalez, Smith, et al. 2020. “Learning Latent Field Dynamics of PDEs.” In Machine Learning and the Physical Sciences Workshop at the 34th Conference on Neural Information Processing Systems (NeurIPS).
Kothari, de Hoop, and Dokmanić. 2020. “Learning the Geometry of Wave-Based Imaging.” In Advances in Neural Information Processing Systems.
Krämer, Köhler, and Noé. n.d. “Preserving Properties of Neural Networks by Perturbative Updates.”
Krishnamurthy, Can, and Schwab. 2022. “Theory of Gating in Recurrent Neural Networks.” Physical Review X.
Lawrence, Loewen, Forbes, et al. 2020. “Almost Surely Stable Deep Dynamics.” In Advances in Neural Information Processing Systems.
Liu, and Scarlett. 2020. “The Generalized Lasso with Nonlinear Observations and Generative Priors.” In Advances in Neural Information Processing Systems.
Lou, Lim, Katsman, et al. 2020. “Neural Manifold Ordinary Differential Equations.” In Advances in Neural Information Processing Systems.
Lu, You, and Huang. 2020. “Woodbury Transformations for Deep Generative Flows.” In Advances in Neural Information Processing Systems.
Lu, Yulong, and Lu. 2020. “A Universal Approximation Theorem of Deep Neural Networks for Expressing Probability Distributions.” In Advances in Neural Information Processing Systems.
Massaroli, Poli, Park, et al. 2020. “Dissecting Neural ODEs.” arXiv:2002.08071 [cs, stat].
Meronen, Irwanto, and Solin. 2020. “Stationary Activations for Uncertainty Calibration in Deep Learning.” In Advances in Neural Information Processing Systems.
Mhammedi, Foster, Simchowitz, et al. 2020. “Learning the Linear Quadratic Regulator from Nonlinear Observations.” In Advances in Neural Information Processing Systems.
Miller, Cole, Louppe, et al. 2020. “Simulation-Efficient Marginal Posterior Estimation with Swyft: Stop Wasting Your Precious Time.”
Morrill, Kidger, Salvi, et al. 2020. “Neural CDEs for Long Time Series via the Log-ODE Method.”
Norcliffe, Bodnar, Day, et al. 2020. “Neural ODE Processes.”
Pfau, and Rezende. 2020. “Integrable Nonparametric Flows.”
Poli, Massaroli, Yamashita, et al. 2020. “Hypersolvers: Toward Fast Continuous-Depth Models.” In Advances in Neural Information Processing Systems.
Priestley. 2004. Spectral Analysis and Time Series. Probability and Mathematical Statistics.
Qin, Wu, Springenberg, et al. 2020. “Training Generative Adversarial Networks by Solving Ordinary Differential Equations.” In Advances in Neural Information Processing Systems.
Rashidinejad, Jiao, and Russell. 2020. “SLIP: Learning to Predict in Unknown Dynamical Systems with Long-Term Memory.” In Advances in Neural Information Processing Systems.
Rojas-Gómez, Yang, Lin, et al. 2020. “Physics-Consistent Data-Driven Seismic Inversion with Adaptive Data Augmentation.”
Saha, and Balamurugan. 2020. “Learning with Operator-Valued Kernels in Reproducing Kernel Krein Spaces.” In Advances in Neural Information Processing Systems.
Salim, Korba, and Luise. 2020. “The Wasserstein Proximal Gradient Algorithm.” In Advances in Neural Information Processing Systems.
Shen, Wang, Ribeiro, et al. 2020. “Sinkhorn Natural Gradient for Generative Models.” In Advances in Neural Information Processing Systems.
Shukla, and Marlin. n.d. “A Survey on Principles, Models and Methods for Learning from Irregularly Sampled Time Series: From Discretization to Attention and Invariance.”
Um, and Holl. 2021. “Differentiable Physics for Improving the Accuracy of Iterative PDE-Solvers with Neural Networks.”
Vahdat, and Kautz. 2020. “NVAE: A Deep Hierarchical Variational Autoencoder.” In Advances in Neural Information Processing Systems.
van Gelder, Wortsman, and Ehsani. 2020. “Deconstructing the Structure of Sparse Neural Networks.”
Verma, Dickerson, and Hines. 2020. “Counterfactual Explanations for Machine Learning: A Review.”
Walder, and Nock. 2020. “All Your Loss Are Belong to Bayes.” In Advances in Neural Information Processing Systems.
Wang, Bentivegna, Zhou, et al. 2020. “Physics-Informed Neural Network Super Resolution for Advection-Diffusion Models.”
Xu, and Darve. 2020. “ADCME: Learning Spatially-Varying Physical Fields Using Deep Neural Networks.” arXiv:2011.11955 [cs, math].
Zhang, Kun, Gong, Stojanov, et al. 2020. “Domain Adaptation as a Problem of Inference on Graphical Models.” In Advances in Neural Information Processing Systems.
Zhang, Rui, Walder, Bonilla, et al. 2020. “Quantile Propagation for Wasserstein-Approximate Gaussian Processes.” In Advances in Neural Information Processing Systems.