ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

Unified and Effective Ensemble Knowledge Distillation

1 April 2022
Chuhan Wu
Fangzhao Wu
Tao Qi
Yongfeng Huang
    FedML

Papers citing "Unified and Effective Ensemble Knowledge Distillation"

4 / 4 papers shown
PROD: Progressive Distillation for Dense Retrieval
Zhenghao Lin, Yeyun Gong, Xiao Liu, Hang Zhang, Chen Lin, ..., Jian Jiao, Jing Lu, Daxin Jiang, Rangan Majumder, Nan Duan
27 Sep 2022
Improving model calibration with accuracy versus uncertainty optimization
R. Krishnan, Omesh Tickoo
UQCV
14 Dec 2020
Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
12 Jun 2018
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
ELM
20 Apr 2018