ResearchTrend.AI
Remember the Past: Distilling Datasets into Addressable Memories for Neural Networks
Zhiwei Deng, Olga Russakovsky
arXiv: 2206.02916 (v2, latest) · 6 June 2022
Topics: DD, FedML
Links: arXiv (abs) · PDF · HTML · GitHub (39★)

Papers citing "Remember the Past: Distilling Datasets into Addressable Memories for Neural Networks"

22 / 22 papers shown
  • A Large-Scale Study on Video Action Dataset Condensation [DD] (13 Mar 2025)
    Yang Chen, Sheng Guo, Bo Zheng, Limin Wang
  • Distilling Dataset into Neural Field [DD] (05 Mar 2025)
    DongHyeok Shin, Heesun Bae, Gyuwon Sim, Wanmo Kang, Il-Chul Moon
  • Dataset Distillation-based Hybrid Federated Learning on Non-IID Data [DD, FedML] (26 Sep 2024)
    Xiufang Shi, Wei Zhang, Mincheng Wu, Guangyi Liu, Z. Wen, Shibo He, Tejal Shah, R. Ranjan
  • Behaviour Distillation [DD] (21 Jun 2024)
    Andrei Lupu, Chris Xiaoxuan Lu, Jarek Liesen, R. T. Lange, Jakob Foerster
  • A Label is Worth a Thousand Images in Dataset Distillation [DD] (15 Jun 2024)
    Tian Qin, Zhiwei Deng, David Alvarez-Melis
  • GIFT: Unlocking Full Potential of Labels in Distilled Dataset at Near-zero Cost (23 May 2024)
    Xinyi Shang, Peng Sun, Tao Lin
  • Disentangled Condensation for Large-scale Graphs [DD] (18 Jan 2024)
    Zhenbang Xiao, Shunyu Liu, Yu Wang, Tongya Zheng, Mingli Song
  • Dancing with Still Images: Video Distillation via Static-Dynamic Disentanglement [DD] (01 Dec 2023)
    Ziyu Wang, Yue Xu, Cewu Lu, Yong-Lu Li
  • Dataset Distillation via the Wasserstein Metric [DD] (30 Nov 2023)
    Haoyang Liu, Yijiang Li, Tiancheng Xing, Peiran Wang, Vibhu Dalal, Luwei Li, Jingrui He, Haohan Wang
  • Efficient Dataset Distillation via Minimax Diffusion [DD] (27 Nov 2023)
    Jianyang Gu, Saeed Vahidian, Vyacheslav Kungurtsev, Haonan Wang, Wei Jiang, Yang You, Yiran Chen
  • Frequency Domain-based Dataset Distillation [DD] (15 Nov 2023)
    DongHyeok Shin, Seungjae Shin, Il-Chul Moon
  • Leveraging Hierarchical Feature Sharing for Efficient Dataset Condensation [DD] (11 Oct 2023)
    Haizhong Zheng, Jiachen Sun, Shutong Wu, B. Kailkhura, Zhuoqing Mao, Chaowei Xiao, Atul Prakash
  • Vision-Language Dataset Distillation [DD, VLM] (15 Aug 2023)
    Xindi Wu, Byron Zhang, Zhiwei Deng, Olga Russakovsky
  • Rethinking Data Distillation: Do Not Overlook Calibration [DD, FedML] (24 Jul 2023)
    Dongyao Zhu, Bowen Lei, Jie M. Zhang, Yanbo Fang, Ruqi Zhang, Yiqun Xie, Dongkuan Xu
  • The Importance of Robust Features in Mitigating Catastrophic Forgetting (29 Jun 2023)
    Hikmat Khan, N. Bouaynaya, Ghulam Rasool
  • Squeeze, Recover and Relabel: Dataset Condensation at ImageNet Scale From A New Perspective [DD] (22 Jun 2023)
    Zeyuan Yin, Eric P. Xing, Zhiqiang Shen
  • Repeated Random Sampling for Minimizing the Time-to-Accuracy of Learning [DD] (28 May 2023)
    Patrik Okanovic, R. Waleffe, Vasilis Mageirakos, Konstantinos E. Nikolakakis, Amin Karbasi, Dionysis Kalogerias, Nezihe Merve Gürel, Theodoros Rekatsinas
  • Distill Gold from Massive Ores: Efficient Dataset Distillation via Critical Samples Selection [DD] (28 May 2023)
    Yue Xu, Yong-Lu Li, Kaitong Cui, Ziyu Wang, Cewu Lu, Yu-Wing Tai, Chi-Keung Tang
  • Dataset Distillation: A Comprehensive Review [DD] (17 Jan 2023)
    Ruonan Yu, Songhua Liu, Xinchao Wang
  • A Comprehensive Survey of Dataset Distillation [DD] (13 Jan 2023)
    Shiye Lei, Dacheng Tao
  • Data Distillation: A Survey [DD] (11 Jan 2023)
    Noveen Sachdeva, Julian McAuley
  • Scaling Up Dataset Distillation to ImageNet-1K with Constant Memory [DD] (19 Nov 2022)
    Justin Cui, Ruochen Wang, Si Si, Cho-Jui Hsieh