Practical Insights into Knowledge Distillation for Pre-Trained Models

Norah Alballa, Marco Canini
22 February 2024 · arXiv:2402.14922

Papers citing "Practical Insights into Knowledge Distillation for Pre-Trained Models"

4 / 4 papers shown
Title
Query-based Knowledge Transfer for Heterogeneous Learning Environments
Query-based Knowledge Transfer for Heterogeneous Learning Environments
Norah Alballa
Wenxuan Zhang
Ziquan Liu
A. Abdelmoniem
Mohamed Elhoseiny
Marco Canini
36
0
0
12 Apr 2025
InFiConD: Interactive No-code Fine-tuning with Concept-based Knowledge
  Distillation
InFiConD: Interactive No-code Fine-tuning with Concept-based Knowledge Distillation
Jinbin Huang
Wenbin He
Liang Gou
Liu Ren
Chris Bryan
47
0
0
25 Jun 2024
Federated Learning on Non-IID Data Silos: An Experimental Study
Federated Learning on Non-IID Data Silos: An Experimental Study
Q. Li
Yiqun Diao
Quan Chen
Bingsheng He
FedML
OOD
87
946
0
03 Feb 2021
Large scale distributed neural network training through online
  distillation
Large scale distributed neural network training through online distillation
Rohan Anil
Gabriel Pereyra
Alexandre Passos
Róbert Ormándi
George E. Dahl
Geoffrey E. Hinton
FedML
272
404
0
09 Apr 2018
1