# CausalML Lab

Causal reasoning is essential for artificial intelligence and machine learning. In the CausalML Lab, we develop new theoretical results that yield insights into fundamental causal discovery and inference problems, and we design novel algorithms based on these insights. Our research can be broadly categorized into four pillars:

*i) Practical and Approximate Causal Reasoning via Information-theoretic Methods,*

*ii) Causal Machine Learning,*

*iii) Fundamentals of Causal Inference and Discovery,*

*iv) High-dimensional Causal Inference with Deep Generative Models.*

# Current Members

*PhD Students*

# Projects

Our group’s research focuses on developing fundamental algorithms for causal discovery and inference from data, and on exploring the connections between causality and machine learning, information theory, graph theory, deep learning, and online learning. Some of the threads we focus on are as follows.

*Practical and Approximate Causal Reasoning via Information-theoretic Methods*

Information-theoretic tools provide us with ways to go beyond the boundaries of classical causal reasoning algorithms and make approximate causal inferences under practical assumptions.
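One primitive behind the entropic approach in the publications below is minimum-entropy coupling: finding a low-entropy joint distribution consistent with given marginals. The following is an illustrative sketch of the greedy heuristic idea, repeatedly matching the largest remaining probability masses; it is a simplified stand-in, not the exact pseudocode from the papers.

```python
import heapq

def greedy_coupling(p, q):
    """Greedily couple two marginal pmfs p and q by repeatedly matching
    the largest remaining probability masses (a low-entropy heuristic)."""
    # Max-heaps via negated masses.
    hp = [(-m, i) for i, m in enumerate(p) if m > 0]
    hq = [(-m, j) for j, m in enumerate(q) if m > 0]
    heapq.heapify(hp)
    heapq.heapify(hq)
    joint = {}
    while hp and hq:
        mp, i = heapq.heappop(hp)  # largest remaining mass in p
        mq, j = heapq.heappop(hq)  # largest remaining mass in q
        m = min(-mp, -mq)
        joint[(i, j)] = joint.get((i, j), 0.0) + m
        if -mp - m > 1e-12:        # push back any leftover mass
            heapq.heappush(hp, (mp + m, i))
        if -mq - m > 1e-12:
            heapq.heappush(hq, (mq + m, j))
    return joint
```

The returned dictionary is a valid coupling: its row sums recover `p` and its column sums recover `q`, while the greedy matching keeps the number of nonzero joint entries (and hence, typically, the joint entropy) small.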

### Related publications

- Z. Jiang, L. Wei, M. Kocaoglu, “Approximate Causal Effect Identification under Weak Confounding,” in Proc. of **ICML’23**, Honolulu, HI, USA, July 2023.
- S. Compton, D. Katz, B. Qi, K. Greenewald, M. Kocaoglu, “Minimum-Entropy Coupling Approximation Guarantees Beyond the Majorization Barrier,” in Proc. of **AISTATS’23**, Valencia, Spain, April 2023.
- S. Compton, K. Greenewald, D. Katz, M. Kocaoglu, “Entropic Causal Inference: Graph Identifiability,” in Proc. of **ICML’22**, July 2022.
- S. Compton, M. Kocaoglu, K. Greenewald, D. Katz, “Entropic Causal Inference: Identifiability and Finite Sample Results,” in Proc. of **NeurIPS’20**, Online, Dec. 2020.
- M. Kocaoglu, S. Shakkottai, A. G. Dimakis, C. Caramanis, S. Vishwanath, “Applications of Common Entropy for Causal Inference,” in Proc. of **NeurIPS’20**, Online, Dec. 2020.
- M. Kocaoglu, A. G. Dimakis, S. Vishwanath, B. Hassibi, “Entropic Causality and Greedy Minimum Entropy Coupling,” in Proc. of **ISIT’17**, 2017.
- M. Kocaoglu, A. G. Dimakis, S. Vishwanath, B. Hassibi, “Entropic Causal Inference,” in Proc. of **AAAI’17**, San Francisco, USA, Feb. 2017.

*Causal Machine Learning*

We explore ways in which causal inference and discovery can enable more robust and practical machine learning solutions.

### Related publications

- S. Kulinski, Z. Zhou, R. Bai, M. Kocaoglu, D. I. Inouye, “Towards Characterizing Domain Counterfactuals for Invertible Latent Causal Models,” in Proc. of **ICLR’24**, Vienna, Austria, May 2024.
- L. Wei, M. Q. Elahi, M. Ghasemi, M. Kocaoglu, “Approximate Allocation Matching for Structural Causal Bandits with Unobserved Confounders,” in Proc. of **NeurIPS’23**, New Orleans, LA, USA, Dec. 2023.
- K. Lee, M. M. Rahman, M. Kocaoglu, “Finding Invariant Predictors Efficiently via Causal Structure,” in Proc. of **UAI’23**, Pittsburgh, PA, USA, Aug. 2023.
- M. A. Ikram, S. Chakraborty, S. Mitra, S. Saini, S. Bagchi, M. Kocaoglu, “Root Cause Analysis of Failures in Microservices through Causal Discovery,” in Proc. of **NeurIPS’22**, Dec. 2022.
- K. Ahuja, P. Sattigeri, K. Shanmugam, D. Wei, K. N. Ramamurthy, M. Kocaoglu, “Conditionally Independent Data Generation,” in Proc. of **UAI’21**, 2021.
- R. Sen, K. Shanmugam, M. Kocaoglu, A. G. Dimakis, S. Shakkottai, “Contextual Bandits with Latent Confounders: An NMF Approach,” in Proc. of **AISTATS’17**, 2017.

*Fundamentals of Causal Inference and Discovery*

We are interested in developing a fundamental understanding of how much causal knowledge can be extracted from data under well-defined assumptions. The cross-cutting nature of causal inference makes this a challenging problem with different constraints coming from different fields. For example, we can set up a randomized controlled trial but the number of such experiments needs to be small since interventions are costly. In other domains, interventional data may already have been collected for different purposes such as average treatment effect estimation, and the goal would be to repurpose them for causal discovery. In certain contexts conducting any experiments might be infeasible, and the observational data may be very noisy or contain only a small number of samples. We develop fundamental bounds on how much causal knowledge is contained in such data, and the associated sound and complete learning algorithms.

### Related publications

- M. Kocaoglu, “Characterization and Learning of Causal Graphs with Small Conditioning Sets,” in Proc. of **NeurIPS’23**, New Orleans, LA, 2023.
- A. Shah, K. Shanmugam, M. Kocaoglu, “Front-door Adjustment Beyond Markov Equivalence with Limited Graph Knowledge,” in Proc. of **NeurIPS’23**, New Orleans, LA, USA, Dec. 2023.
- S. Gao, R. Addanki, T. Yu, R. A. Rossi, M. Kocaoglu, “Causal Discovery in Semi-Stationary Time Series,” in Proc. of **NeurIPS’23**, New Orleans, LA, USA, Dec. 2023.
- C. Squires, S. Magliacane, K. Greenewald, D. Katz, M. Kocaoglu, K. Shanmugam, “Active Structure Learning of Causal DAGs via Directed Clique Trees,” in Proc. of **NeurIPS’20**, Online, Dec. 2020.
- K. Greenewald, D. Katz, K. Shanmugam, S. Magliacane, M. Kocaoglu, E. B. Adsera, G. Bresler, “Sample Efficient Active Learning of Causal Trees,” in Proc. of **NeurIPS’19**, Vancouver, Canada, Dec. 2019.
- A. Jaber, M. Kocaoglu, K. Shanmugam, E. Bareinboim, “Causal Discovery from Soft Interventions with Unknown Targets: Characterization and Learning,” in Proc. of **NeurIPS’20**, Online, Dec. 2020.
- M. Kocaoglu*, A. Jaber*, K. Shanmugam*, E. Bareinboim, “Characterization and Learning of Causal Graphs with Latent Variables from Soft Interventions,” in Proc. of **NeurIPS’19**, Vancouver, Canada, Dec. 2019.
- E. Lindgren, M. Kocaoglu, A. G. Dimakis, S. Vishwanath, “Experimental Design for Cost-Aware Learning of Causal Graphs,” in Proc. of **NeurIPS’18**, Montreal, Canada, Dec. 2018.
- E. Lindgren, M. Kocaoglu, A. G. Dimakis, S. Vishwanath, “Submodularity and Minimum Cost Intervention Design for Learning Causal Graphs,” in **DISCML’17 Workshop at NIPS’17**, Dec. 2017.
- M. Kocaoglu, K. Shanmugam, E. Bareinboim, “Experimental Design for Learning Causal Graphs with Latent Variables,” in Proc. of **NeurIPS’17**, 2017.
- M. Kocaoglu, A. G. Dimakis, S. Vishwanath, “Cost-Optimal Learning of Causal Graphs,” in Proc. of **ICML’17**, 2017.
- M. Kocaoglu, A. G. Dimakis, S. Vishwanath, “Learning Causal Graphs with Constraints,” in **NeurIPS’16 Workshop**: What If? Inference and Learning of Hypothetical and Counterfactual Interventions in Complex Systems, Barcelona, Spain, Dec. 2016.
- K. Shanmugam, M. Kocaoglu, A. G. Dimakis, S. Vishwanath, “Learning Causal Graphs with Small Interventions,” in Proc. of **NeurIPS’15**, Montreal, Canada, Dec. 2015.

*High-dimensional Causal Inference with Deep Generative Models*

We are interested in leveraging the representation capabilities of deep neural networks to enable sampling from causal queries in the presence of high-dimensional variables such as images.
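The core sampling primitive here is ancestral sampling through a structural causal model, where a do() intervention replaces a variable's mechanism and cuts its incoming edges. The sketch below uses simple closed-form mechanisms as stand-ins for the trained generative networks that would produce high-dimensional samples in practice; the graph and parameters are hypothetical.

```python
import random

def sample_scm(n, do=None, seed=0):
    """Ancestral sampling from a toy SCM Z -> X -> Y.  Entries in `do`
    fix a variable to a constant, severing its incoming edges; in a
    deep causal generative model each mechanism would instead be a
    conditional neural generator."""
    rng = random.Random(seed)
    do = do or {}
    samples = []
    for _ in range(n):
        z = do.get("z", rng.gauss(0, 1))
        x = do.get("x", z + rng.gauss(0, 0.1))
        y = do.get("y", 2 * x + rng.gauss(0, 0.1))
        samples.append({"z": z, "x": x, "y": y})
    return samples

# Draw from the interventional distribution P(Z, Y | do(X = 3)):
# X is clamped to 3, so Y concentrates near 6 while Z stays standard normal.
interventional = sample_scm(2000, do={"x": 3.0})
```

Replacing these scalar mechanisms with conditional generative networks is what allows the same ancestral-sampling scheme to answer causal queries over images and other high-dimensional variables.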

### Related publications

- M. M. Rahman*, M. Jordan*, M. Kocaoglu, “Conditional Generative Models are Sufficient to Sample from Any Causal Effect Estimand,” arXiv, 2024.
- M. M. Rahman, M. Kocaoglu, “Modular Learning of Deep Causal Generative Models for High-dimensional Causal Inference,” arXiv, 2024.
- M. Kocaoglu*, C. Snyder*, A. G. Dimakis, S. Vishwanath, “CausalGAN: Learning Causal Implicit Generative Models with Adversarial Training,” in Proc. of **ICLR’18**, Vancouver, Canada, May 2018.

# Past Members

## PostDoc Researchers

- Lai Wei, *August 2022 - August 2023*

## Visiting Researcher

- Suyeong Park, *July - August 2022*

# Acknowledgement

Our lab is currently supported by funding from the National Science Foundation (NSF) and Adobe Research.