
Towards Robust Influence Functions with Flat Validation Minima

Main: 10 pages, Appendix: 7 pages, Bibliography: 4 pages; 3 figures, 13 tables
Abstract

The Influence Function (IF) is a widely used technique for assessing the impact of individual training samples on model predictions. However, existing IF methods often fail to provide reliable influence estimates in deep neural networks, particularly when applied to noisy training data. This issue does not stem from inaccuracies in parameter change estimation, which has been the primary focus of prior research, but rather from deficiencies in loss change estimation, specifically due to the sharpness of validation risk. In this work, we establish a theoretical connection between influence estimation error, validation set risk, and its sharpness, underscoring the importance of flat validation minima for accurate influence estimation. Furthermore, we introduce a novel Influence Function estimator specifically designed for flat validation minima. Experimental results across various tasks validate the superiority of our approach.
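For context, the quantity the abstract discusses can be sketched with the classic first-order influence estimate (Koh & Liang style), which approximates how a validation loss changes when a training sample is upweighted. The sketch below uses a toy ridge-regression model (so the Hessian is exact and cheap) and a made-up validation point `x_val`; it illustrates the standard estimator only, not the paper's proposed flat-minima variant.

```python
import numpy as np

# Classic first-order influence of upweighting training point z on the
# validation loss at z_val:
#   I(z, z_val) = -grad L(z_val)^T  H^{-1}  grad L(z)
# Toy ridge regression is used so H is exact; this is an illustrative
# sketch, not the paper's method.

rng = np.random.default_rng(0)
n, d, lam = 50, 3, 0.1
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

# Regularized risk: R(w) = 1/(2n) sum_i (x_i^T w - y_i)^2 + lam/2 ||w||^2
H = X.T @ X / n + lam * np.eye(d)        # exact Hessian of R
w = np.linalg.solve(H, X.T @ y / n)      # closed-form risk minimizer

def grad_loss(x, t, w_):
    """Gradient of the per-sample squared loss 0.5 * (x^T w - t)^2."""
    return (x @ w_ - t) * x

x_val, y_val = rng.normal(size=d), 0.5   # hypothetical validation point
loss_val = lambda w_: 0.5 * (x_val @ w_ - y_val) ** 2

def influence(i):
    """First-order influence of upweighting training sample i on loss_val."""
    g_val = grad_loss(x_val, y_val, w)
    return -g_val @ np.linalg.solve(H, grad_loss(X[i], y[i], w))

# Sanity check: remove sample 0, re-solve, and compare the actual change
# in validation loss against the influence prediction (removal is
# approximately downweighting by 1/n).
i = 0
H_loo = (X.T @ X - np.outer(X[i], X[i])) / n + lam * np.eye(d)
w_loo = np.linalg.solve(H_loo, (X.T @ y - X[i] * y[i]) / n)
actual = loss_val(w_loo) - loss_val(w)
approx = -influence(i) / n
```

In deep networks the Hessian solve must itself be approximated (e.g. with iterative Hessian-vector products), which is where the parameter-change error studied in prior work enters; the abstract's point is that even with that solved, sharp validation minima corrupt the loss-change side of the estimate.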

@article{ye2025_2505.19097,
  title={Towards Robust Influence Functions with Flat Validation Minima},
  author={Xichen Ye and Yifan Wu and Weizhong Zhang and Cheng Jin and Yifan Chen},
  journal={arXiv preprint arXiv:2505.19097},
  year={2025}
}