ResearchTrend.AI

Periocular Embedding Learning with Consistent Knowledge Distillation from Face

arXiv: 2012.06746
12 December 2020
Yoon Gyo Jung, Jaewoo Park, C. Low, Jacky Chen Long Chai, Leslie Ching Ow Tiong, Andrew Beng Jin Teoh
CVBM

Papers citing "Periocular Embedding Learning with Consistent Knowledge Distillation from Face"

4 / 4 papers shown
Mask-invariant Face Recognition through Template-level Knowledge Distillation
Marco Huber, Fadi Boutros, Florian Kirchbuchner, Naser Damer
CVBM · 10 Dec 2021 · 34 / 31 / 0

Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation
Taehyeon Kim, Jaehoon Oh, Nakyil Kim, Sangwook Cho, Se-Young Yun
19 May 2021 · 25 / 232 / 0

Similarity-Preserving Knowledge Distillation
Frederick Tung, Greg Mori
23 Jul 2019 · 86 / 963 / 0

Paying More Attention to Attention: Improving the Performance of Convolutional Neural Networks via Attention Transfer
Sergey Zagoruyko, N. Komodakis
12 Dec 2016 · 92 / 2,561 / 0