
Regularized f-Divergence Kernel Tests

Mónica Ribero
Antonin Schrab
Arthur Gretton
Main: 7 pages
Bibliography: 4 pages
Appendix: 26 pages
10 figures
7 tables
Abstract

We propose a framework to construct practical kernel-based two-sample tests from the family of f-divergences. The test statistic is computed from the witness function of a regularized variational representation of the divergence, which we estimate using kernel methods. The proposed test is adaptive over hyperparameters such as the kernel bandwidth and the regularization parameter. We provide theoretical guarantees for statistical test power across our family of f-divergence estimates. While our test covers a variety of f-divergences, we bring particular focus to the Hockey-Stick divergence, motivated by its applications to differential privacy auditing and machine unlearning evaluation. For two-sample testing, experiments demonstrate that different f-divergences are sensitive to different localized differences, illustrating the importance of leveraging diverse statistics. For machine unlearning, we propose a relative test that distinguishes true unlearning failures from safe distributional variations.
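The paper's estimator is not reproduced here, but the general recipe the abstract describes (estimate a witness function with kernels, evaluate the discrepancy it induces between the two samples, and calibrate the rejection threshold nonparametrically) can be illustrated with a minimal sketch. The sketch below substitutes the standard MMD witness for the paper's regularized f-divergence witness, fixes a single bandwidth rather than adapting over kernel and regularization hyperparameters, and uses hypothetical function names; it shows the structure of a kernel two-sample permutation test, not the authors' estimator.

import numpy as np

def gaussian_kernel(A, B, bandwidth):
    # Pairwise squared distances between rows of A and B, then Gaussian kernel.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * bandwidth**2))

def witness_statistic(X, Y, bandwidth):
    # Witness h(z) = mean_x k(z, x) - mean_y k(z, y); the statistic is the
    # difference of its means under the two samples, i.e. the (biased)
    # squared-MMD estimate. A stand-in for a regularized f-divergence witness.
    Kxx = gaussian_kernel(X, X, bandwidth)
    Kyy = gaussian_kernel(Y, Y, bandwidth)
    Kxy = gaussian_kernel(X, Y, bandwidth)
    return Kxx.mean() + Kyy.mean() - 2 * Kxy.mean()

def permutation_test(X, Y, bandwidth=1.0, n_perms=500, alpha=0.05, seed=0):
    # Calibrate the threshold by recomputing the statistic on random
    # relabelings of the pooled sample (valid under the null p = q).
    rng = np.random.default_rng(seed)
    Z = np.vstack([X, Y])
    n = len(X)
    observed = witness_statistic(X, Y, bandwidth)
    null_stats = np.empty(n_perms)
    for i in range(n_perms):
        perm = rng.permutation(len(Z))
        null_stats[i] = witness_statistic(Z[perm[:n]], Z[perm[n:]], bandwidth)
    p_value = (1 + np.sum(null_stats >= observed)) / (1 + n_perms)
    return observed, p_value, p_value <= alpha

For example, with X drawn from a standard Gaussian and Y from a Gaussian shifted by 0.5 in each coordinate (200 points each), permutation_test(X, Y) should reject at level 0.05 with high probability, while two samples from the same distribution should not. The paper's adaptive test additionally aggregates over a grid of bandwidths and regularization parameters rather than fixing one.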
