ResearchTrend.AI

arXiv: 2310.13782
Data-Free Knowledge Distillation Using Adversarially Perturbed OpenGL Shader Images
Logan Frank, Jim Davis
20 October 2023
arXiv · PDF · HTML

Papers citing "Data-Free Knowledge Distillation Using Adversarially Perturbed OpenGL Shader Images"

(2 papers)
CAE-DFKD: Bridging the Transferability Gap in Data-Free Knowledge Distillation
Zherui Zhang, Changwei Wang, Rongtao Xu, Wenyuan Xu, Shibiao Xu, Yu Zhang, Li Guo
30 Apr 2025
Extracting Training Data from Large Language Models
Nicholas Carlini, Florian Tramèr, Eric Wallace, Matthew Jagielski, Ariel Herbert-Voss, ..., Tom B. Brown, D. Song, Ulfar Erlingsson, Alina Oprea, Colin Raffel
14 Dec 2020