Not All Knowledge Is Created Equal: Mutual Distillation of Confident Knowledge
arXiv:2106.01489 (2 June 2021)
Ziyun Li, Xinshao Wang, Diane Hu, N. Robertson, David Clifton, Christoph Meinel, Haojin Yang
Papers citing "Not All Knowledge Is Created Equal: Mutual Distillation of Confident Knowledge" (2 papers):
Emergent Specialization: Rare Token Neurons in Language Models
Jing Liu, Haozheng Wang, Yueheng Li
MILM, LRM
19 May 2025
Data Selection for Efficient Model Update in Federated Learning
Hongrui Shi, Valentin Radu
FedML
05 Nov 2021