# Causal inference in highly parameterized ML

September 18, 2020 — March 7, 2024

Applying causal graph structure in the challenging environment of a no-holds-barred nonparametric machine learning algorithm such as a neural net or its ilk. I am interested in this because it seems necessary, and indeed kind of obvious, for handling things like dataset shift, yet it is often ignored. What is that about?

I do not know at the moment. This is a link salad for now.

See also the companion brain salads on graphical models and supervised models.

## 1 Invariance approaches

Léon Bottou, From Causal Graphs to Causal Invariance:

For many problems, it’s difficult to even attempt drawing a causal graph. While structural causal models provide a complete framework for causal inference, it is often hard to encode known physical laws (such as Newton’s gravitation, or the ideal gas law) as causal graphs. In familiar machine learning territory, how does one model the causal relationships between individual pixels and a target prediction? This is one of the motivating questions behind the paper Invariant Risk Minimization (IRM). In place of structured graphs, the authors elevate invariance to the defining feature of causality.
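
To make the invariance idea concrete, here is a minimal numpy sketch of the IRMv1 objective on a toy problem of my own devising (not the paper's code): each environment varies the strength of a spurious feature, and the penalty is the squared gradient of each environment's risk with respect to a dummy scalar classifier, evaluated at w = 1.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_env(n, spurious):
    # y is caused by x1; x2 is a spurious correlate of y whose
    # strength differs across environments
    x1 = rng.normal(size=n)
    y = x1 + rng.normal(size=n)
    x2 = spurious * y + 0.1 * rng.normal(size=n)
    return np.stack([x1, x2], axis=1), y

envs = [make_env(4000, s) for s in (0.5, 2.0)]

def objective(beta, weight):
    # pooled risk plus, per environment, the squared gradient of that
    # environment's risk w.r.t. a dummy scale w, evaluated at w = 1
    total = 0.0
    for X, y in envs:
        pred = X @ beta
        resid = pred - y
        grad_w = 2.0 * np.mean(resid * pred)  # d/dw E[(w*pred - y)^2] at w=1
        total += np.mean(resid ** 2) + weight * grad_w ** 2
    return total

def grid_search(weight):
    grid = np.linspace(-0.5, 1.5, 41)  # step 0.05
    scored = ((objective(np.array([a, b]), weight), a, b)
              for a in grid for b in grid)
    _, a, b = min(scored)
    return a, b

erm = grid_search(0.0)    # plain ERM leans on the spurious feature
irm = grid_search(100.0)  # the invariance penalty suppresses it
print("ERM beta:", erm, "IRM beta:", irm)
```

With the penalty switched off we recover ERM, which puts real weight on the spurious feature; with a large penalty weight the minimiser shifts toward the causal feature alone, because only the causal predictor has (near-)zero risk gradient in every environment.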

He commends the Cloudera Fast Forward tutorial Causality for Machine Learning, which is a nice bit of applied work.

## 2 Causality for feedback and continuous fields

## 3 Double learning

See Double learning.

## 4 As “Deep Causality”

Not sure what this is yet (Berrevoets et al. 2024; Deng et al. 2022; Lagemann et al. 2023).

## 5 Benchmarking

Detecting causal associations in time series datasets is a key challenge for novel insights into complex dynamical systems such as the Earth system or the human brain. Interactions in such systems present a number of major challenges for causal discovery techniques and it is largely unknown which methods perform best for which challenge.

The CauseMe platform provides ground truth benchmark datasets featuring different real data challenges to assess and compare the performance of causal discovery methods. The available benchmark datasets are either generated from synthetic models mimicking real challenges, or are real world data sets where the causal structure is known with high confidence. The datasets vary in dimensionality, complexity and sophistication.
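
As a toy illustration of the kind of method such benchmarks score (my own sketch, nothing to do with CauseMe's actual submission format): a Granger-style check asks whether the past of one series reduces prediction error for another beyond that series' own past.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 2000
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()  # x drives y

def resid_var(target, predictors):
    # residual variance of a least-squares fit on lagged predictors
    X = np.column_stack(predictors)
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return np.var(target - X @ beta)

# does past x help predict y beyond y's own past? (and vice versa)
gain_xy = resid_var(y[1:], [y[:-1]]) - resid_var(y[1:], [y[:-1], x[:-1]])
gain_yx = resid_var(x[1:], [x[:-1]]) - resid_var(x[1:], [x[:-1], y[:-1]])
print(f"x -> y gain: {gain_xy:.2f}, y -> x gain: {gain_yx:.2f}")
```

Here the x → y gain is large and the reverse gain is negligible, matching the simulated direction; the hard part that CauseMe probes is making this robust to nonlinearity, latent confounding and high dimension.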

## 6 Tooling

### 6.1 DoWhy
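
DoWhy structures an analysis as model → identify → estimate → refute (via `CausalModel`, `identify_effect`, `estimate_effect`). The backdoor adjustment it automates can be sketched by hand (my toy data, not the library's API):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
z = rng.binomial(1, 0.5, n)                    # confounder
x = rng.binomial(1, 0.2 + 0.6 * z)             # treatment depends on z
y = 2.0 * x + 3.0 * z + rng.normal(size=n)     # true effect of x on y is 2.0

# the naive contrast is confounded by z
naive = y[x == 1].mean() - y[x == 0].mean()

# backdoor adjustment: average stratum-wise contrasts over P(z)
adjusted = sum(
    (y[(x == 1) & (z == v)].mean() - y[(x == 0) & (z == v)].mean())
    * (z == v).mean()
    for v in (0, 1)
)
print(f"naive: {naive:.2f}  adjusted: {adjusted:.2f}")  # ~3.80 vs ~2.00
```

The adjusted estimate recovers the true effect of 2.0; DoWhy's value-add is deriving which adjustment is valid from the graph, then refuting the estimate with placebo and subset checks.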

### 6.2 CausalNex

CausalNex is a Python library that uses Bayesian Networks to combine machine learning and domain expertise for causal reasoning. You can use CausalNex to uncover structural relationships in your data, learn complex distributions, and observe the effect of potential interventions.
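
What "observing the effect of potential interventions" means in a Bayesian network can be shown by hand with a three-node example (my own CPTs, not CausalNex's API): conditioning follows Bayes' rule, while intervening truncates the factorization.

```python
# Three binary nodes: z -> x, z -> y, x -> y, specified by CPTs
p_z = {0: 0.5, 1: 0.5}
p_x1_given_z = {0: 0.2, 1: 0.8}                    # P(x=1 | z)
p_y1_given_xz = {(0, 0): 0.1, (1, 0): 0.5,
                 (0, 1): 0.4, (1, 1): 0.9}         # P(y=1 | x, z)

# Observational: P(y=1 | x=1) by enumeration over z and Bayes' rule
num = sum(p_z[z] * p_x1_given_z[z] * p_y1_given_xz[(1, z)] for z in (0, 1))
den = sum(p_z[z] * p_x1_given_z[z] for z in (0, 1))
p_obs = num / den

# Interventional: do(x=1) deletes the z -> x edge, i.e. the factor
# P(x | z) is dropped from the factorization and x is clamped to 1
p_do = sum(p_z[z] * p_y1_given_xz[(1, z)] for z in (0, 1))

print(f"P(y=1 | x=1)     = {p_obs:.3f}")   # 0.820: seeing x=1 hints z=1
print(f"P(y=1 | do(x=1)) = {p_do:.3f}")    # 0.700: setting x breaks that hint
```

The gap between 0.82 and 0.70 is exactly the confounding through z; CausalNex does this kind of computation on learned structures and fitted CPTs.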

### 6.3 cause2e

The main contribution of cause2e is the integration of two established causal packages that have so far been separate and cumbersome to combine:

- Causal discovery methods from the py-causal package, which is a Python wrapper around parts of the Java TETRAD software. It provides many algorithms for learning the causal graph from data and domain knowledge.
- Causal reasoning methods from the DoWhy package, which is the current standard for the steps of a causal analysis starting from a known causal graph and data.

### 6.4 TETRAD

TETRAD (source, tutorial) is a tool for discovering, visualising, and estimating large empirical DAGs, including general graphical inference and causal discovery. It’s written by eminent causal inference people.

Tetrad is a program which creates, simulates data from, estimates, tests, predicts with, and searches for causal and statistical models. The aim of the program is to provide sophisticated methods in a friendly interface requiring very little statistical sophistication of the user and no programming knowledge. It is not intended to replace flexible statistical programming systems such as Matlab, Splus or R. Tetrad is freeware that performs many of the functions in commercial programs such as Netica, Hugin, LISREL, EQS and other programs, and many discovery functions these commercial programs do not perform. …

The Tetrad programs describe causal models in three distinct parts or stages: a picture, representing a directed graph specifying hypothetical causal relations among the variables; a specification of the family of probability distributions and kinds of parameters associated with the graphical model; and a specification of the numerical values of those parameters.

py-causal is a Python wrapper around TETRAD; r-causal is the equivalent for R.

## 7 Incoming

- Nisha Muktewar and Chris Wallace’s Causality for Machine Learning is the report Bottou recommends on this theme.
- For coders, Ben Dickson writes on Why machine learning struggles with causality.
- Cheng Soon Ong recommends Finn Lattimore to me as an important perspective.
- biomedia-mira/deepscm: Repository for Deep Structural Causal Models for Tractable Counterfactual Inference (Pawlowski, Coelho de Castro, and Glocker 2020).
- ICML 2022 Tutorial on causality and deep learning
- Causality and Deep Learning: Synergies, Challenges and the Future Tutorial

## 8 References

*arXiv:1902.07409 [Stat]*.

*Probabilistic and Causal Inference: The Works of Judea Pearl*.

*arXiv:1812.03253 [Cs, Stat]*.

*Frontiers in Psychology*.

*arXiv:1611.06221 [Cs, Stat]*.

*arXiv:1803.08784 [Cs, Stat]*.

*Artificial Neural Networks and Structural Equation Modeling: Marketing and Consumer Research Applications*.

*arXiv:2104.04103 [Cs, Stat]*.

*arXiv:2009.09070 [Cs]*.

*arXiv:1909.10893 [Cs, Stat]*.

*Proceedings of the 34th International Conference on Machine Learning*.

*Chaos: An Interdisciplinary Journal of Nonlinear Science*.

*Advances in Neural Information Processing Systems 29*.

*Journal of Artificial Intelligence Research*.

*arXiv:1709.02023 [Cs, Math, Stat]*.

*Proceedings of the National Academy of Sciences*.

*Nature Machine Intelligence*.

*arXiv:2006.07796 [Cs, Stat]*.

*Advances in Neural Information Processing Systems*.

*Journal of Hydrometeorology*.

*arXiv:1811.12359 [Cs, Stat]*.

*Proceedings of the 37th International Conference on Machine Learning*.

*Advances in Neural Information Processing Systems 30*.

*arXiv:2102.12353 [Cs, Stat]*.

*Proceedings of the 39th International Conference on Machine Learning*.

*arXiv:2109.00173 [Cs, Stat]*.

*Journal of Machine Learning Research*.

*arXiv:1910.08527 [Cs, Stat]*.

*Advances In Neural Information Processing Systems*.

*arXiv:2110.10819 [Cs]*.

*Advances in Neural Information Processing Systems*.

*Journal of the Royal Statistical Society Series B: Statistical Methodology*.

*Elements of Causal Inference: Foundations and Learning Algorithms*. Adaptive Computation and Machine Learning Series.

*Journal of the Royal Statistical Society Series C: Applied Statistics*.

*Proceedings of the 27th ACM International Conference on Information and Knowledge Management*. CIKM ’18.

*IEEE Access*.

*Journal of Machine Learning Research*.

*Uncertainty in Artificial Intelligence*.

*Nature Communications*.

*Probabilistic and Causal Inference: The Works of Judea Pearl*.

*Proceedings of the IEEE*.

*arXiv:1606.03976 [Cs, Stat]*.

*Proceedings of the 33rd International Conference on Neural Information Processing Systems*.

*Advances in Neural Information Processing Systems*.

*ACM Computing Surveys*.

*Proceedings of the AAAI Conference on Artificial Intelligence*.

*arXiv:2109.03795 [Cs, Stat]*.

*Frontiers in Artificial Intelligence*.

*arXiv:2004.08697 [Cs, Stat]*.

*Advances in Neural Information Processing Systems*.

*arXiv:2010.07684 [Cs]*.