Discriminability Distillation in Group Representation Learning

25 August 2020
Manyuan Zhang, Guanglu Song, Hang Zhou, Yu Liu
FedML

Papers citing "Discriminability Distillation in Group Representation Learning"

4 of 4 citing papers shown.

Revisiting Label Smoothing and Knowledge Distillation Compatibility: What was Missing?
Keshigeyan Chandrasegaran, Ngoc-Trung Tran, Yunqing Zhao, Ngai-Man Cheung
29 Jun 2022

Refining Pseudo Labels with Clustering Consensus over Generations for Unsupervised Object Re-identification
Xiao Zhang, Yixiao Ge, Yu Qiao, Hongsheng Li
11 Jun 2021

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
VLM
09 Jun 2020

Deeply learned face representations are sparse, selective, and robust
Yi Sun, Xiaogang Wang, Xiaoou Tang
CVBM
03 Dec 2014