Natural Way to Overcome the Catastrophic Forgetting in Neural Networks

27 April 2020
Alexey Kutalev
CLL

Papers citing "Natural Way to Overcome the Catastrophic Forgetting in Neural Networks"

5 / 5 papers shown:

  • Continual Learning With Quasi-Newton Methods · Steven Vander Eeckt, Hugo Van hamme · CLL, BDL · 25 Mar 2025
  • CSTRL: Context-Driven Sequential Transfer Learning for Abstractive Radiology Report Summarization · Mst. Fahmida Sultana Naznin, Adnan Ibney Faruq, Mostafa Rifat Tazwar, Md Jobayer, Md. Mehedi Hasan Shawon, Md Rakibul Hasan · MedIm · 21 Feb 2025
  • Correlation of the importances of neural network weights calculated by modern methods of overcoming catastrophic forgetting · Alexey Kutalev · 24 Oct 2022
  • Empirical investigations on WVA structural issues · Alexey Kutalev, A. Lapina · CLL · 11 Aug 2022
  • Stabilizing Elastic Weight Consolidation method in practical ML tasks and using weight importances for neural network pruning · Alexey Kutalev, A. Lapina · 21 Sep 2021