Knowledge Distillation in Wide Neural Networks: Risk Bound, Data Efficiency and Imperfect Teacher
Guangda Ji, Zhanxing Zhu
arXiv:2010.10090, 20 October 2020
Papers citing "Knowledge Distillation in Wide Neural Networks: Risk Bound, Data Efficiency and Imperfect Teacher" (11 of 11 papers shown)
LiteLMGuard: Seamless and Lightweight On-Device Prompt Filtering for Safeguarding Small Language Models against Quantization-induced Risks and Vulnerabilities
Kalyan Nakka, Jimmy Dani, Ausmit Mondal, Nitesh Saxena (AAML). 08 May 2025

Data Duplication: A Novel Multi-Purpose Attack Paradigm in Machine Unlearning
Dayong Ye, Tianqing Zhu, J. Li, Kun Gao, B. Liu, L. Zhang, Wanlei Zhou, Y. Zhang (AAML, MU). 28 Jan 2025

The Effect of Optimal Self-Distillation in Noisy Gaussian Mixture Model
Kaito Takanami, Takashi Takahashi, Ayaka Sakata. 27 Jan 2025

Provable Weak-to-Strong Generalization via Benign Overfitting
David X. Wu, A. Sahai. 06 Oct 2024

Is On-Device AI Broken and Exploitable? Assessing the Trust and Ethics in Small Language Models
Kalyan Nakka, Jimmy Dani, Nitesh Saxena. 08 Jun 2024

Frameless Graph Knowledge Distillation
Dai Shi, Zhiqi Shao, Yi Guo, Junbin Gao. 13 Jul 2023

Do Not Blindly Imitate the Teacher: Using Perturbed Loss for Knowledge Distillation
Rongzhi Zhang, Jiaming Shen, Tianqi Liu, Jia-Ling Liu, Michael Bendersky, Marc Najork, Chao Zhang. 08 May 2023

Supervision Complexity and its Role in Knowledge Distillation
Hrayr Harutyunyan, A. S. Rawat, A. Menon, Seungyeon Kim, Surinder Kumar. 28 Jan 2023

Linkless Link Prediction via Relational Distillation
Zhichun Guo, William Shiao, Shichang Zhang, Yozen Liu, Nitesh V. Chawla, Neil Shah, Tong Zhao. 11 Oct 2022

Informed Learning by Wide Neural Networks: Convergence, Generalization and Sampling Complexity
Jianyi Yang, Shaolei Ren. 02 Jul 2022

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao (VLM). 09 Jun 2020