Review Learning: Alleviating Catastrophic Forgetting with Generative Replay without Generator

17 October 2022
Jaesung Yoo, Sung-Hyuk Choi, Yewon Yang, Suhyeon Kim, J. Choi, Dongkyeong Lim, Yaeji Lim, H. J. Joo, Dae-Jung Kim, R. Park, Hyeong-Jin Yoon, Kwangsoo Kim

KELM · OffRL
Abstract

When a deep learning model is trained sequentially on different datasets, it forgets the knowledge acquired from earlier data, a phenomenon known as catastrophic forgetting. This deteriorates the model's performance across datasets, which is critical in privacy-preserving deep learning (PPDL) applications based on transfer learning (TL). To overcome this, we propose review learning (RL), a generative-replay-based continual learning technique that does not require a separate generator. Data samples are generated from the memory stored within the synaptic weights of the deep learning model and are used to review the knowledge acquired from previous datasets. The performance of RL was validated through PPDL experiments. Simulations and real-world medical multi-institutional experiments were conducted using three types of binary classification electronic health record data. In the real-world experiments, the global area under the receiver operating characteristic curve was 0.710 for RL and 0.655 for TL, indicating that RL was effective in retaining previously learned knowledge.

View on arXiv: https://arxiv.org/abs/2210.09394