DiLM: Distilling Dataset into Language Model for Text-level Dataset Distillation

30 March 2024
Authors: Aru Maekawa, Satoshi Kosugi, Kotaro Funakoshi, Manabu Okumura
Topics: DD
Links: arXiv (abs) · PDF · HTML · GitHub (21★)

Papers citing "DiLM: Distilling Dataset into Language Model for Text-level Dataset Distillation"

5 papers shown.

  1. Approximating Language Model Training Data from Weights
     John X. Morris, Junjie Oscar Yin, Woojeong Kim, Vitaly Shmatikov, Alexander M. Rush
     18 Jun 2025
  2. CONCORD: Concept-Informed Diffusion for Dataset Distillation
     Jianyang Gu, Haonan Wang, Ruoxi Jia, Saeed Vahidian, Vyacheslav Kungurtsev, Wei Jiang, Yiran Chen
     Topics: DiffM, DD
     23 May 2025
  3. Transferable text data distillation by trajectory matching
     Rong Yao, Hailin Hu, Yifei Fu, Hanting Chen, Wenyi Fang, Fanyi Du, Kai Han, Yunhe Wang
     14 Apr 2025
  4. Synthetic Text Generation for Training Large Language Models via Gradient Matching
     Dang Nguyen, Zeman Li, M. Bateni, Vahab Mirrokni, Meisam Razaviyayn, Baharan Mirzasoleiman
     24 Feb 2025
  5. On Learning Representations for Tabular Data Distillation
     Inwon Kang, Parikshit Ram, Yi Zhou, Horst Samulowitz, Oshani Seneviratne
     Topics: DD
     23 Jan 2025