Remember the Past: Distilling Datasets into Addressable Memories for Neural Networks
Zhiwei Deng, Olga Russakovsky
arXiv:2206.02916 (v2), 6 June 2022
Tags: FedML, DD
Papers citing "Remember the Past: Distilling Datasets into Addressable Memories for Neural Networks" (22 papers):
1. A Large-Scale Study on Video Action Dataset Condensation. Yang Chen, Sheng Guo, Bo Zheng, Limin Wang. [DD] 13 Mar 2025
2. Distilling Dataset into Neural Field. DongHyeok Shin, Heesun Bae, Gyuwon Sim, Wanmo Kang, Il-Chul Moon. [DD] 05 Mar 2025
3. Dataset Distillation-based Hybrid Federated Learning on Non-IID Data. Xiufang Shi, Wei Zhang, Mincheng Wu, Guangyi Liu, Z. Wen, Shibo He, Tejal Shah, R. Ranjan. [DD, FedML] 26 Sep 2024
4. Behaviour Distillation. Andrei Lupu, Chris Xiaoxuan Lu, Jarek Liesen, R. T. Lange, Jakob Foerster. [DD] 21 Jun 2024
5. A Label is Worth a Thousand Images in Dataset Distillation. Tian Qin, Zhiwei Deng, David Alvarez-Melis. [DD] 15 Jun 2024
6. GIFT: Unlocking Full Potential of Labels in Distilled Dataset at Near-zero Cost. Xinyi Shang, Peng Sun, Tao Lin. 23 May 2024
7. Disentangled Condensation for Large-scale Graphs. Zhenbang Xiao, Shunyu Liu, Yu Wang, Tongya Zheng, Mingli Song. [DD] 18 Jan 2024
8. Dancing with Still Images: Video Distillation via Static-Dynamic Disentanglement. Ziyu Wang, Yue Xu, Cewu Lu, Yong-Lu Li. [DD] 01 Dec 2023
9. Dataset Distillation via the Wasserstein Metric. Haoyang Liu, Yijiang Li, Tiancheng Xing, Peiran Wang, Vibhu Dalal, Luwei Li, Jingrui He, Haohan Wang. [DD] 30 Nov 2023
10. Efficient Dataset Distillation via Minimax Diffusion. Jianyang Gu, Saeed Vahidian, Vyacheslav Kungurtsev, Haonan Wang, Wei Jiang, Yang You, Yiran Chen. [DD] 27 Nov 2023
11. Frequency Domain-based Dataset Distillation. DongHyeok Shin, Seungjae Shin, Il-Chul Moon. [DD] 15 Nov 2023
12. Leveraging Hierarchical Feature Sharing for Efficient Dataset Condensation. Haizhong Zheng, Jiachen Sun, Shutong Wu, B. Kailkhura, Zhuoqing Mao, Chaowei Xiao, Atul Prakash. [DD] 11 Oct 2023
13. Vision-Language Dataset Distillation. Xindi Wu, Byron Zhang, Zhiwei Deng, Olga Russakovsky. [DD, VLM] 15 Aug 2023
14. Rethinking Data Distillation: Do Not Overlook Calibration. Dongyao Zhu, Bowen Lei, Jie M. Zhang, Yanbo Fang, Ruqi Zhang, Yiqun Xie, Dongkuan Xu. [DD, FedML] 24 Jul 2023
15. The Importance of Robust Features in Mitigating Catastrophic Forgetting. Hikmat Khan, N. Bouaynaya, Ghulam Rasool. 29 Jun 2023
16. Squeeze, Recover and Relabel: Dataset Condensation at ImageNet Scale From A New Perspective. Zeyuan Yin, Eric P. Xing, Zhiqiang Shen. [DD] 22 Jun 2023
17. Repeated Random Sampling for Minimizing the Time-to-Accuracy of Learning. Patrik Okanovic, R. Waleffe, Vasilis Mageirakos, Konstantinos E. Nikolakakis, Amin Karbasi, Dionysis Kalogerias, Nezihe Merve Gürel, Theodoros Rekatsinas. [DD] 28 May 2023
18. Distill Gold from Massive Ores: Efficient Dataset Distillation via Critical Samples Selection. Yue Xu, Yong-Lu Li, Kaitong Cui, Ziyu Wang, Cewu Lu, Yu-Wing Tai, Chi-Keung Tang. [DD] 28 May 2023
19. Dataset Distillation: A Comprehensive Review. Ruonan Yu, Songhua Liu, Xinchao Wang. [DD] 17 Jan 2023
20. A Comprehensive Survey of Dataset Distillation. Shiye Lei, Dacheng Tao. [DD] 13 Jan 2023
21. Data Distillation: A Survey. Noveen Sachdeva, Julian McAuley. [DD] 11 Jan 2023
22. Scaling Up Dataset Distillation to ImageNet-1K with Constant Memory. Justin Cui, Ruochen Wang, Si Si, Cho-Jui Hsieh. [DD] 19 Nov 2022