Anytime Inference with Distilled Hierarchical Neural Ensembles

Adria Ruiz, Jakob Verbeek
3 March 2020 · arXiv 2003.01474
Topics: UQCV, BDL, FedML

Papers citing "Anytime Inference with Distilled Hierarchical Neural Ensembles"

3 / 3 papers shown
  1. DICE: Diversity in Deep Ensembles via Conditional Redundancy Adversarial Estimation
     Alexandre Ramé, Matthieu Cord
     FedML
     14 Jan 2021
  2. Knowledge Distillation by On-the-Fly Native Ensemble
     Xu Lan, Xiatian Zhu, S. Gong
     12 Jun 2018
  3. Large scale distributed neural network training through online distillation
     Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
     FedML
     09 Apr 2018