Swing Distillation: A Privacy-Preserving Knowledge Distillation Framework (arXiv: 2212.08349)
16 December 2022
Authors: Junzhuo Li, Xinwei Wu, Weilong Dong, Shuangzhi Wu, Chao Bian, Deyi Xiong
Papers citing "Swing Distillation: A Privacy-Preserving Knowledge Distillation Framework" (6 papers):
- A Survey of What to Share in Federated Learning: Perspectives on Model Utility, Privacy Leakage, and Communication Efficiency (20 Jul 2023) [FedML]
  Jiawei Shao, Zijian Li, Wenqiang Sun, Tailin Zhou, Yuchang Sun, Lumin Liu, Zehong Lin, Yuyi Mao, Jun Zhang
- Revisiting Label Smoothing and Knowledge Distillation Compatibility: What was Missing? (29 Jun 2022)
  Keshigeyan Chandrasegaran, Ngoc-Trung Tran, Yunqing Zhao, Ngai-man Cheung
- Memorization in NLP Fine-tuning Methods (25 May 2022) [AAML]
  Fatemehsadat Mireshghallah, Archit Uniyal, Tianhao Wang, David E. Evans, Taylor Berg-Kirkpatrick
- On a Utilitarian Approach to Privacy Preserving Text Generation (23 Apr 2021)
  Zekun Xu, Abhinav Aggarwal, Oluwaseyi Feyisetan, Nathanael Teissier
- Extracting Training Data from Large Language Models (14 Dec 2020) [MLAU, SILM]
  Nicholas Carlini, Florian Tramèr, Eric Wallace, Matthew Jagielski, Ariel Herbert-Voss, ..., Tom B. Brown, D. Song, Ulfar Erlingsson, Alina Oprea, Colin Raffel
- Teaching Machines to Read and Comprehend (10 Jun 2015)
  Karl Moritz Hermann, Tomás Kociský, Edward Grefenstette, L. Espeholt, W. Kay, Mustafa Suleyman, Phil Blunsom