ResearchTrend.AI
Human-like Forgetting Curves in Deep Neural Networks

22 May 2025
Dylan Kline
Main: 6 pages · 3 figures · Bibliography: 2 pages
Abstract

This study bridges cognitive science and neural network design by examining whether artificial models exhibit human-like forgetting curves. Drawing upon Ebbinghaus' seminal work on memory decay and principles of spaced repetition, we propose a quantitative framework to measure information retention in neural networks. Our approach computes the recall probability by evaluating the similarity between a network's current hidden state and previously stored prototype representations. This retention metric facilitates the scheduling of review sessions, thereby mitigating catastrophic forgetting during deployment and enhancing training efficiency by prompting targeted reviews. Our experiments with Multi-Layer Perceptrons reveal human-like forgetting curves, with knowledge becoming increasingly robust through scheduled reviews. This alignment between neural network forgetting curves and established human memory models identifies neural networks as an architecture that naturally emulates human memory decay and can inform state-of-the-art continual learning algorithms.
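The abstract describes a retention metric based on the similarity between a network's current hidden state and stored prototype representations, with reviews scheduled when estimated recall decays. A minimal sketch of that idea, assuming cosine similarity as the comparison and a fixed recall threshold for triggering a review (both are illustrative choices; the paper's exact formulation may differ):

```python
import numpy as np

def recall_probability(hidden_state, prototype):
    """Retention proxy: cosine similarity between the network's current
    hidden representation of an item and the prototype representation
    stored when the item was first learned, mapped to [0, 1]."""
    cos = np.dot(hidden_state, prototype) / (
        np.linalg.norm(hidden_state) * np.linalg.norm(prototype) + 1e-12
    )
    return (cos + 1.0) / 2.0  # map cosine range [-1, 1] onto [0, 1]

def schedule_reviews(recall_probs, threshold=0.7):
    """Spaced-repetition-style scheduling: flag the indices of items
    whose estimated recall has dropped below the threshold."""
    return [i for i, p in enumerate(recall_probs) if p < threshold]

# Example: an item whose hidden state has drifted from its prototype
proto = np.array([1.0, 0.0, 0.0])
drifted = np.array([0.6, 0.8, 0.0])
p = recall_probability(drifted, proto)
due = schedule_reviews([0.95, p], threshold=0.7)
```

Here the prototype would be the hidden activation captured right after an item is learned; as training on other data shifts the representation, the recall probability falls along a forgetting curve, and items falling below the threshold are queued for review.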

@article{kline2025_2506.12034,
  title={Human-like Forgetting Curves in Deep Neural Networks},
  author={Dylan Kline},
  journal={arXiv preprint arXiv:2506.12034},
  year={2025}
}