ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Forget Vectors at Play: Universal Input Perturbations Driving Machine Unlearning in Image Classification

21 December 2024
Changchang Sun
Ren Wang
Yihua Zhang
Jinghan Jia
Jiancheng Liu
Gaowen Liu
Sijia Liu
Yan Yan
AAML, MU
Abstract

Machine unlearning (MU), which seeks to erase the influence of specific unwanted data from already-trained models, is becoming increasingly vital in model editing, particularly for compliance with evolving data regulations such as the "right to be forgotten". Conventional approaches are predominantly model-based, typically retraining or fine-tuning the model's weights to meet unlearning requirements. In this work, we approach the MU problem from a novel input perturbation-based perspective, in which the model weights remain intact throughout the unlearning process. We demonstrate the existence of a proactive input-based unlearning strategy, referred to as the forget vector, which can be generated as an input-agnostic data perturbation and is as effective as model-based approximate unlearning approaches. We also explore forget vector arithmetic, whereby multiple class-specific forget vectors are combined through simple operations (e.g., linear combinations) to generate new forget vectors for unseen unlearning tasks, such as forgetting arbitrary subsets across classes. Extensive experiments validate the effectiveness and adaptability of the forget vector, showcasing its competitive performance relative to state-of-the-art model-based methods. Code is available at https://github.com/Changchangsun/Forget-Vector.
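The two ideas in the abstract — an input-agnostic perturbation applied at inference time with frozen model weights, and forget-vector arithmetic that linearly combines class-specific vectors for unseen unlearning tasks — can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the function names (`apply_forget_vector`, `combine_forget_vectors`), the pixel-range clipping, and the toy shapes are assumptions for the sketch; see the linked repository for the actual method.

```python
import numpy as np

def apply_forget_vector(x, delta):
    """Add an input-agnostic forget vector to a batch of images.

    The model's weights are never touched; unlearning is driven
    purely by the additive perturbation delta (shaped like one image).
    Clipping to [0, 1] keeps the perturbed inputs valid images
    (an assumption of this sketch, not stated in the abstract).
    """
    return np.clip(x + delta, 0.0, 1.0)

def combine_forget_vectors(deltas, weights):
    """Forget-vector arithmetic: form a new forget vector for an
    unseen unlearning task as a linear combination of
    class-specific forget vectors."""
    weights = np.asarray(weights, dtype=np.float32)
    return np.tensordot(weights, np.stack(deltas), axes=1)

# Toy example: two class-specific forget vectors for 8x8 grayscale "images".
rng = np.random.default_rng(0)
delta_cat = rng.normal(scale=0.05, size=(8, 8)).astype(np.float32)
delta_dog = rng.normal(scale=0.05, size=(8, 8)).astype(np.float32)

# Combine them to target forgetting a subset spanning both classes.
delta_mix = combine_forget_vectors([delta_cat, delta_dog], [0.5, 0.5])

batch = rng.uniform(size=(4, 8, 8)).astype(np.float32)
perturbed = apply_forget_vector(batch, delta_mix)
print(perturbed.shape)  # (4, 8, 8)
```

In the paper's setting the combined perturbation would then be added to every forget-set input before it reaches the (unmodified) classifier; the weights on each class-specific vector are what select which unlearning task the new vector serves.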

@article{sun2025_2412.16780,
  title={Forget Vectors at Play: Universal Input Perturbations Driving Machine Unlearning in Image Classification},
  author={Changchang Sun and Ren Wang and Yihua Zhang and Jinghan Jia and Jiancheng Liu and Gaowen Liu and Yan Yan and Sijia Liu},
  journal={arXiv preprint arXiv:2412.16780},
  year={2025}
}