
A Generative Framework for Causal Estimation via Importance-Weighted Diffusion Distillation

Abstract

Estimating individualized treatment effects from observational data is a central challenge in causal inference, largely due to covariate imbalance and confounding bias arising from non-randomized treatment assignment. While inverse probability weighting (IPW) is a well-established solution to this problem, its integration into modern deep learning frameworks remains limited. In this work, we propose Importance-Weighted Diffusion Distillation (IWDD), a novel generative framework that combines the pretraining of diffusion models with importance-weighted score distillation to enable accurate and fast causal estimation, including potential outcome prediction and treatment effect estimation. We demonstrate how IPW can be naturally incorporated into the distillation of pretrained diffusion models, and further introduce a randomization-based adjustment that eliminates the need to compute IPW explicitly, thereby simplifying computation and, more importantly, provably reducing the variance of gradient estimates. Empirical results show that IWDD achieves state-of-the-art out-of-sample prediction performance, attaining the highest win rates against baseline methods, significantly improving causal estimation and supporting the development of individualized treatment strategies. We will release our PyTorch code for reproducibility and future research.
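As background for the IPW component the abstract builds on: inverse probability weighting reweights each unit by the inverse of its probability of receiving the treatment it actually received, so that treated and control groups become comparable in covariates. The minimal sketch below is illustrative only (it is not the paper's IWDD method); it uses a synthetic dataset with a known propensity function to show that the weighted covariate means of the two groups align, whereas the unweighted means do not.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
x = rng.normal(size=n)               # single confounding covariate
e = 1.0 / (1.0 + np.exp(-x))         # true propensity score P(T=1 | x)
t = rng.binomial(1, e)               # confounded treatment assignment

# IPW weights: 1/e(x) for treated units, 1/(1 - e(x)) for controls.
# In practice e(x) must be estimated, e.g. by logistic regression.
w = t / e + (1 - t) / (1 - e)

# Unweighted group means of x differ (confounding)...
raw_gap = x[t == 1].mean() - x[t == 0].mean()

# ...while IPW-weighted means are approximately balanced.
m1 = np.average(x, weights=w * t)
m0 = np.average(x, weights=w * (1 - t))
weighted_gap = m1 - m0
```

With the weights applied, both groups approximate the full-population covariate distribution, which is the balancing property the paper exploits when weighting the distillation objective.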

@article{song2025_2505.11444,
  title={A Generative Framework for Causal Estimation via Importance-Weighted Diffusion Distillation},
  author={Xinran Song and Tianyu Chen and Mingyuan Zhou},
  journal={arXiv preprint arXiv:2505.11444},
  year={2025}
}