Practical Insights into Knowledge Distillation for Pre-Trained Models
Norah Alballa, Marco Canini
arXiv:2402.14922, 22 February 2024
Papers citing "Practical Insights into Knowledge Distillation for Pre-Trained Models" (4 of 4 papers shown)

1. Query-based Knowledge Transfer for Heterogeneous Learning Environments
   Norah Alballa, Wenxuan Zhang, Ziquan Liu, A. Abdelmoniem, Mohamed Elhoseiny, Marco Canini
   12 Apr 2025

2. InFiConD: Interactive No-code Fine-tuning with Concept-based Knowledge Distillation
   Jinbin Huang, Wenbin He, Liang Gou, Liu Ren, Chris Bryan
   25 Jun 2024

3. Federated Learning on Non-IID Data Silos: An Experimental Study
   Q. Li, Yiqun Diao, Quan Chen, Bingsheng He
   Tags: FedML, OOD
   03 Feb 2021

4. Large scale distributed neural network training through online distillation
   Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
   Tags: FedML
   09 Apr 2018