
Understand the Effect of Importance Weighting in Deep Learning on Dataset Shift

Abstract

We evaluate the effectiveness of importance weighting in deep neural networks under label shift and covariate shift. On synthetic 2D data (linearly separable and moon-shaped) using logistic regression and MLPs, we observe that weighting strongly affects decision boundaries early in training but fades with prolonged optimization. On CIFAR-10 with various class imbalances, only L2 regularization (not dropout) helps preserve weighting effects. In a covariate-shift experiment, importance weighting yields no significant performance gain, highlighting challenges on complex data. Our results call into question the practical utility of importance weighting for real-world distribution shifts.
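To make the setup concrete, below is a minimal sketch (not the authors' code) of importance weighting under label shift: each class y is reweighted by w(y) = p_target(y) / p_source(y), and the weights are passed to a standard weighted cross-entropy loss. The class frequencies and tensor shapes are hypothetical placeholders.

```python
import torch
import torch.nn as nn

# Hypothetical label distributions: an imbalanced training set and the
# target (test-time) distribution we wish to match.
p_source = torch.tensor([0.70, 0.30])   # training label frequencies
p_target = torch.tensor([0.50, 0.50])   # assumed test label frequencies

# Importance weights w(y) = p_target(y) / p_source(y).
class_weights = p_target / p_source

# Weighted cross-entropy: under-represented classes contribute more to the
# loss, shifting the decision boundary early in training.
criterion = nn.CrossEntropyLoss(weight=class_weights)

logits = torch.randn(8, 2)               # stand-in model outputs
labels = torch.randint(0, 2, (8,))       # stand-in labels
loss = criterion(logits, labels)
```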

@article{vo2025_2505.03617,
  title={Understand the Effect of Importance Weighting in Deep Learning on Dataset Shift},
  author={Thien Nhan Vo and Thanh Xuan Truong},
  journal={arXiv preprint arXiv:2505.03617},
  year={2025}
}