EsaCL: Efficient Continual Learning of Sparse Models

11 January 2024
Weijieying Ren, V. Honavar
Topic: CLL

Papers citing "EsaCL: Efficient Continual Learning of Sparse Models"

4 / 4 papers shown

Multi-source Unsupervised Domain Adaptation on Graphs with Transferability Modeling
Tianxiang Zhao, Dongsheng Luo, Xiang Zhang, Suhang Wang
14 Jun 2024

SparCL: Sparse Continual Learning on the Edge
Zifeng Wang, Zheng Zhan, Yifan Gong, Geng Yuan, Wei Niu, T. Jian, Bin Ren, Stratis Ioannidis, Yanzhi Wang, Jennifer Dy
Topic: CLL
20 Sep 2022

Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks
Torsten Hoefler, Dan Alistarh, Tal Ben-Nun, Nikoli Dryden, Alexandra Peste
Topic: MQ
31 Jan 2021

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
Topic: ODL
15 Sep 2016