arXiv: 2404.00264
DiLM: Distilling Dataset into Language Model for Text-level Dataset Distillation
30 March 2024
Authors: Aru Maekawa, Satoshi Kosugi, Kotaro Funakoshi, Manabu Okumura
Tags: DD
Links: ArXiv (abs) · PDF · HTML · GitHub (21★)
Papers citing "DiLM: Distilling Dataset into Language Model for Text-level Dataset Distillation" (5 of 5 papers shown)

Approximating Language Model Training Data from Weights
John X. Morris, Junjie Oscar Yin, Woojeong Kim, Vitaly Shmatikov, Alexander M. Rush
18 Jun 2025

CONCORD: Concept-Informed Diffusion for Dataset Distillation
Jianyang Gu, Haonan Wang, Ruoxi Jia, Saeed Vahidian, Vyacheslav Kungurtsev, Wei Jiang, Yiran Chen
Tags: DiffM, DD
23 May 2025

Transferable text data distillation by trajectory matching
Rong Yao, Hailin Hu, Yifei Fu, Hanting Chen, Wenyi Fang, Fanyi Du, Kai Han, Yunhe Wang
14 Apr 2025

Synthetic Text Generation for Training Large Language Models via Gradient Matching
Dang Nguyen, Zeman Li, M. Bateni, Vahab Mirrokni, Meisam Razaviyayn, Baharan Mirzasoleiman
24 Feb 2025

On Learning Representations for Tabular Data Distillation
Inwon Kang, Parikshit Ram, Yi Zhou, Horst Samulowitz, Oshani Seneviratne
Tags: DD
23 Jan 2025