Research

I am an Assistant Professor in the Elmore Family School of Electrical and Computer Engineering and, by courtesy, in the Department of Statistics at Purdue University. My current research focuses on developing fundamental causal inference and causal discovery algorithms and on integrating them into causal machine learning solutions.

News

New! Our lab received the Amazon Research Award in AI for Information Security!

New! Our paper titled “Towards Characterizing Domain Counterfactuals for Invertible Latent Causal Models” is accepted at ICLR’24 here.

New! Our paper titled “Characterization and Learning of Causal Graphs with Small Conditioning Sets” is accepted at NeurIPS’23 here.

New! Our paper titled “Front-door Adjustment Beyond Markov Equivalence with Limited Graph Knowledge” is accepted at NeurIPS’23 here.

New! Our paper titled “Approximate Allocation Matching for Structural Causal Bandits with Unobserved Confounders” is accepted at NeurIPS’23 here.

New! Our paper titled “Causal Discovery in Semi-Stationary Time Series” is accepted at NeurIPS’23 here.

Our paper titled “Finding Invariant Predictors Efficiently via Causal Structure” is accepted at UAI’23 here.

Our paper titled “Approximate Causal Effect Identification under Weak Confounding” is accepted at ICML’23 here.

I am the guest editor for the special issue of Entropy on “Information-theoretic Methods for Causal Inference and Discovery” here. Please submit your papers on the topic!

Our lab received the NSF CAREER Award!

Our paper titled “Minimum-Entropy Coupling Approximation Guarantees Beyond the Majorization Barrier” is accepted at AISTATS’23 here.

Our paper titled “Root Cause Analysis of Failures in Microservices through Causal Discovery” is accepted at NeurIPS’22 here.

Our paper titled “Entropic Causal Inference: Graph Identifiability” is accepted at ICML’22. Preprint coming soon.

Our lab received the Adobe Data Science Research Award for the project Causal Discovery for Root Cause Analysis.

We are organizing an AAAI Workshop on Information-Theoretic Methods for Causal Inference and Discovery! Submit your papers at the intersection of causal inference/discovery, information theory, and machine learning. See the workshop website for more details and submission instructions.

Our paper titled “Entropic Causal Inference: Identifiability for Trees and Complete Graphs” is accepted at the ICML-21 Workshop on Information-Theoretic Methods for Rigorous, Responsible, and Reliable Machine Learning - ITR3.

Our paper titled “Conditionally Independent Data Generation” is accepted at UAI’21; a preprint is available here.

I will be teaching a research course on causality called Probabilistic Causal Inference in Fall’21. It is open to anyone with a background in probability theory. Reach out to me if you are at Purdue and are interested in taking it. See the course website and syllabus here.

Our paper titled “Applications of Common Entropy for Causal Inference” is accepted at NeurIPS’20, available here.

Our paper titled “Causal Discovery from Soft Interventions with Unknown Targets: Characterization and Learning” is accepted at NeurIPS’20, available here.

Our paper titled “Entropic Causal Inference: Identifiability and Finite Sample Results” is accepted at NeurIPS’20, available here.

Our paper titled “Active Structure Learning of Causal DAGs via Directed Clique Trees” is accepted at NeurIPS’20, available here.

Our paper titled “Characterization and Learning of Causal Graphs with Latent Variables from Soft Interventions” is accepted at NeurIPS’19, available here.

Our paper titled “Sample Efficient Active Learning of Causal Trees” is accepted at NeurIPS’19, available here.

I gave an invited talk on CausalGAN at the WHY-19 Symposium. The website and slides are here.

I gave a talk on the Shannon Channel about entropic methods for causal inference; you can watch it here.

Our paper titled “Experimental Design for Cost-Aware Learning of Causal Graphs” is accepted at NeurIPS’18. A preprint is available here.

A new preprint titled “Entropic Causal Inference with Latent Variables” is available here.

Our paper titled “CausalGAN: Learning Causal Implicit Generative Models with Adversarial Training” is accepted at ICLR’18. A preprint is available here.

Our paper titled “Experimental Design for Learning Causal Graphs with Latent Variables” is accepted at NIPS’17. A preprint is available here.

A new preprint titled “CausalGAN: Learning Causal Implicit Generative Models with Adversarial Training” is available here. Tensorflow implementation is available here.

Our paper titled “Cost-Optimal Learning of Causal Graphs” is accepted at ICML’17. A preprint is available here.

Our paper titled “Entropic Causality and Greedy Minimum Entropy Coupling” is accepted at ISIT 2017. A preprint is available here.

A new preprint titled “Sparse Quadratic Logistic Regression in Sub-quadratic Time” is available here.

A new preprint titled “Cost-Optimal Learning of Causal Graphs” is available here.

A new preprint titled “Entropic Causality and Greedy Minimum Entropy Coupling” is available here.

Our paper titled “Contextual Bandits with Latent Confounders: An NMF Approach” is accepted at AISTATS 2017. A preprint is available here.

Python code for the entropy-based causal inference algorithm from the “Entropic Causal Inference” paper is available here.

Our paper titled “Entropic Causal Inference” is accepted at AAAI 2017. A preprint is available here.

Our paper titled “Learning Causal Graphs with Constraints” is accepted for a poster presentation at the NIPS 2016 workshop “What if? Inference and Learning of Hypothetical and Counterfactual Interventions in Complex Systems”, available here.

We have a preprint available on contextual bandits with unobserved confounders here.

Our paper titled “Learning Causal Graphs with Small Interventions” is accepted at NIPS 2015. A preprint is available here.

We are organizing a student seminar series within WNCG. You can find the schedule here.

Our paper titled “Sparse Polynomial Learning and Graph Sketching” is accepted for oral presentation at NIPS 2014. A preprint is available here.