Theoretical Understanding of the Information Flow on Continual Learning Performance
Joshua Andle, Salimeh Yasaei Sekeh
26 April 2022 · arXiv:2204.12010 · CLL

Papers citing "Theoretical Understanding of the Information Flow on Continual Learning Performance"

5 / 5 papers shown

Information Consistent Pruning: How to Efficiently Search for Sparse Networks?
Soheil Gharatappeh, Salimeh Yasaei Sekeh
28 Jan 2025 · 59 · 0 · 0

Towards Explaining Deep Neural Network Compression Through a Probabilistic Latent Space
Mahsa Mozafari-Nia, Salimeh Yasaei Sekeh
29 Feb 2024 · 25 · 0 · 0

Formalizing the Generalization-Forgetting Trade-off in Continual Learning
Krishnan Raghavan, Prasanna Balaprakash
28 Sep 2021 · CLL · 59 · 30 · 0

Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
Chelsea Finn, Pieter Abbeel, Sergey Levine
09 Mar 2017 · OOD · 496 · 11,727 · 0

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
15 Sep 2016 · ODL · 310 · 2,896 · 0