ResearchTrend.AI

arXiv:1905.10777
Cross-Resolution Face Recognition via Prior-Aided Face Hallucination and Residual Knowledge Distillation

26 May 2019
Hanyang Kong, Jian-jun Zhao, X. Tu, Junliang Xing, Shengmei Shen, Jiashi Feng
SupR, CVBM

Papers citing "Cross-Resolution Face Recognition via Prior-Aided Face Hallucination and Residual Knowledge Distillation"

6 citing papers:
Look One and More: Distilling Hybrid Order Relational Knowledge for Cross-Resolution Image Recognition
Shiming Ge, Kangkai Zhang, Haolin Liu, Yingying Hua, Shengwei Zhao, Xin Jin, Hao Wen
09 Sep 2024
Gradient Knowledge Distillation for Pre-trained Language Models
Lean Wang, Lei Li, Xu Sun
VLM
02 Nov 2022
Teaching Where to Look: Attention Similarity Knowledge Distillation for Low Resolution Face Recognition
Sungho Shin, Joosoon Lee, Junseok Lee, Yeonguk Yu, Kyoobin Lee
CVBM
29 Sep 2022
Image-to-Video Generation via 3D Facial Dynamics
X. Tu, Yingtian Zou, Jian-jun Zhao, W. Ai, Jian Dong, ..., Zhikang Wang, Guodong Guo, Zhifeng Li, Wei Liu, Jiashi Feng
CVBM, 3DH
31 May 2021
Joint Face Image Restoration and Frontalization for Recognition
X. Tu, Jian-jun Zhao, Qiankun Liu, W. Ai, G. Guo, Zhifeng Li, Wei Liu, Jiashi Feng
CVBM
12 May 2021
Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
VLM
09 Jun 2020