
Forget by Uncertainty: Orthogonal Entropy Unlearning for Quantized Neural Networks

Tian Zhang
Yujia Tong
Junhao Dong
Ke Xu
Yuze Wang
Jingling Yuan
Main: 8 pages · 5 figures · 8 tables · Bibliography: 2 pages · Appendix: 11 pages
Abstract

The deployment of quantized neural networks on edge devices, combined with privacy regulations such as GDPR, creates an urgent need for machine unlearning in quantized models. Existing methods, however, face two critical limitations: they induce forgetting by training models to memorize incorrect labels, conflating forgetting with misremembering, and they employ scalar gradient reweighting, which cannot resolve directional conflicts between forgetting and retention gradients. We propose OEU, a novel Orthogonal Entropy Unlearning framework with two key innovations: 1) entropy-guided unlearning maximizes prediction uncertainty on the data to be forgotten, achieving genuine forgetting rather than confident misprediction, and 2) gradient orthogonal projection eliminates interference by projecting forgetting gradients onto the orthogonal complement of retain gradients, providing theoretical guarantees for utility preservation under a first-order approximation. Extensive experiments demonstrate that OEU outperforms existing methods in both forgetting effectiveness and retain accuracy.
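The two ideas in the abstract can be illustrated compactly: an entropy objective that pushes predictions toward the uniform distribution (maximal uncertainty), and a projection that strips from the forgetting gradient its component along the retain gradient, so the update is harmless to the retained-data loss to first order. The sketch below is illustrative only; function and variable names are not from the paper.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def prediction_entropy(logits):
    # Entropy of the predictive distribution; maximizing it drives
    # the model toward uniform (maximally uncertain) outputs on the
    # forget set, rather than toward confident wrong labels.
    p = softmax(logits)
    return -np.sum(p * np.log(p + 1e-12))

def orthogonal_project(g_forget, g_retain):
    # Project the forgetting gradient onto the orthogonal complement
    # of the retain gradient: the resulting update has zero inner
    # product with g_retain, so (under a first-order approximation)
    # it does not increase the loss on retained data.
    coef = g_forget @ g_retain / (g_retain @ g_retain + 1e-12)
    return g_forget - coef * g_retain

# Illustration with toy gradients: after projection the update
# direction is orthogonal to the retain gradient.
g_f = np.array([1.0, 2.0, 3.0])
g_r = np.array([0.0, 1.0, 0.0])
g_proj = orthogonal_project(g_f, g_r)
print(np.dot(g_proj, g_r))  # ~0.0
```

Note that uniform logits attain the maximum entropy log(K) for K classes, which is the fixed point the entropy objective targets.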
