Distributional robustness in inference
July 12, 2019 — August 30, 2022
adversarial
AI
functional analysis
game theory
learning
metrics
optimization
probability
statistics
Placeholder for inference that is robust to mis-specification of the distribution, up to some ball in some probability metric. I saw Jose Blanchet present on Wasserstein distributional robustness at MCM 2019. Bookmarked for later.
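To make the "ball in a probability metric" idea concrete, here is a minimal numerical sketch of the simplest case I know: the worst-case value of a 1-Lipschitz statistic (the mean) over a Wasserstein-1 ball around the empirical distribution. By Kantorovich duality this supremum has a closed form, attained by shifting every sample point by the radius. All variable names are illustrative, not from any of the papers below.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=500)   # empirical sample defining P_n
eps = 0.1                  # radius of the Wasserstein-1 ambiguity ball

# Worst-case expectation of a 1-Lipschitz statistic (here the identity,
# i.e. the mean) over {Q : W1(Q, P_n) <= eps}.  Kantorovich duality gives
# sup_Q E_Q[X] = E_{P_n}[X] + eps * Lip = mean(x) + eps,
# attained by translating every sample point by eps.
robust_mean = x.mean() + eps

# Sanity check: random feasible perturbations never beat the bound.
# A pointwise displacement delta with mean |delta| <= eps yields a
# coupling of cost <= eps, hence W1(Q, P_n) <= eps.
for _ in range(200):
    delta = rng.uniform(-1.0, 1.0, size=x.size)
    delta *= eps / max(np.abs(delta).mean(), 1e-12)  # scale to mean|delta| = eps
    assert (x + delta).mean() <= robust_mean + 1e-9
```

The same duality trick (worst case = empirical value + radius × Lipschitz constant) is what makes many of the Wasserstein-DRO reformulations in the references tractable; for non-Lipschitz losses the dual is more involved.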
To understand: Connections to causal inference, differential privacy, adversarial learning, hedging in portfolios…
Also to understand: how old is distributional robustness? The literature already felt mature when I discovered it, but no one seems to agree on a specific foundational publication.
1 References
Blanchet, Chen, and Zhou. 2018. “Distributionally Robust Mean-Variance Portfolio Selection with Wasserstein Distances.” arXiv:1802.04885 [Stat].
Blanchet, Kang, and Murthy. 2016. “Robust Wasserstein Profile Inference and Applications to Machine Learning.” arXiv:1610.05627 [Math, Stat].
Blanchet, Kang, Zhang, et al. 2017. “Data-Driven Optimal Cost Selection for Distributionally Robust Optimization.” arXiv:1705.07152 [Stat].
Blanchet, Murthy, and Si. 2019. “Confidence Regions in Wasserstein Distributionally Robust Estimation.” arXiv:1906.01614 [Math, Stat].
Blanchet, Murthy, and Zhang. 2018. “Optimal Transport Based Distributionally Robust Optimization: Structural Properties and Iterative Schemes.” arXiv:1810.02403 [Math].
Cisneros-Velarde, Petersen, and Oh. 2020. “Distributionally Robust Formulation and Model Selection for the Graphical Lasso.” In International Conference on Artificial Intelligence and Statistics.
Diakonikolas, Kamath, Kane, et al. 2017. “Being Robust (in High Dimensions) Can Be Practical.” arXiv:1703.00893 [Cs, Math, Stat].
Farokhi. 2020. “Distributionally-Robust Machine Learning Using Locally Differentially-Private Data.” arXiv:2006.13488 [Cs, Math, Stat].
Gao, and Kleywegt. 2022. “Distributionally Robust Stochastic Optimization with Wasserstein Distance.”
Go, and Isaac. n.d. “Robust Expected Information Gain for Optimal Bayesian Experimental Design Using Ambiguity Sets.”
Husain. 2020. “Distributional Robustness with IPMs and Links to Regularization and GANs.” arXiv:2006.04349 [Cs, Stat].
Mahdian, Blanchet, and Glynn. 2019. “Optimal Transport Relaxations with Application to Wasserstein GANs.” arXiv:1906.03317 [Cs, Math, Stat].
Meinshausen. 2018. “Causality from a Distributional Robustness Point of View.” In 2018 IEEE Data Science Workshop (DSW).
Mohajerin Esfahani, and Kuhn. 2018. “Data-Driven Distributionally Robust Optimization Using the Wasserstein Metric: Performance Guarantees and Tractable Reformulations.” Mathematical Programming.
Sadeghi, Wang, Ma, et al. 2020. “Learning While Respecting Privacy and Robustness to Distributional Uncertainties and Adversarial Data.” arXiv:2007.03724 [Cs, Eess, Math].
Shapiro. 2017. “Distributionally Robust Stochastic Programming.” SIAM Journal on Optimization.
Shapiro, Zhou, and Lin. 2021. “Bayesian Distributionally Robust Optimization.”
Weichwald, and Peters. 2020. “Causality in Cognitive Neuroscience: Concepts, Challenges, and Distributional Robustness.” arXiv:2002.06060 [q-Bio, Stat].