ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

arXiv:2203.08559

Learning to Generate Synthetic Training Data using Gradient Matching and Implicit Differentiation

16 March 2022
Dmitry Medvedev
A. Dýakonov

Papers citing "Learning to Generate Synthetic Training Data using Gradient Matching and Implicit Differentiation"

5 / 5 papers shown
DiLM: Distilling Dataset into Language Model for Text-level Dataset Distillation
Aru Maekawa, Satoshi Kosugi, Kotaro Funakoshi, Manabu Okumura
30 Mar 2024
DD-RobustBench: An Adversarial Robustness Benchmark for Dataset Distillation
Yifan Wu, Jiawei Du, Ping Liu, Yuewei Lin, Wenqing Cheng, Wei-ping Xu
20 Mar 2024
M3D: Dataset Condensation by Minimizing Maximum Mean Discrepancy
Hansong Zhang, Shikun Li, Pengju Wang, Dan Zeng, Shiming Ge
26 Dec 2023
AST: Effective Dataset Distillation through Alignment with Smooth and High-Quality Expert Trajectories
Jiyuan Shen, Wenzhuo Yang, Kwok-Yan Lam
16 Oct 2023
New Properties of the Data Distillation Method When Working With Tabular Data
Dmitry Medvedev, A. Dýakonov
19 Oct 2020