Function-Consistent Feature Distillation
arXiv:2304.11832 · 24 April 2023
Dongyang Liu, Meina Kan, Shiguang Shan, Xilin Chen

Papers citing "Function-Consistent Feature Distillation"

14 / 14 papers shown
VRM: Knowledge Distillation via Virtual Relation Matching
W. Zhang, Fei Xie, Weidong Cai, Chao Ma (28 Feb 2025)

Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation
Jiaming Lv, Haoyuan Yang, P. Li (11 Dec 2024)

TAS: Distilling Arbitrary Teacher and Student via a Hybrid Assistant
Guopeng Li, Qiang Wang, K. Yan, Shouhong Ding, Yuan Gao, Gui-Song Xia (16 Oct 2024)

Incorporating Clinical Guidelines through Adapting Multi-modal Large Language Model for Prostate Cancer PI-RADS Scoring
Tiantian Zhang, Manxi Lin, Hongda Guo, Xiaofan Zhang, Ka Fung Peter Chiu, Aasa Feragen, Qi Dou (14 May 2024)

Knowledge Distillation with Multi-granularity Mixture of Priors for Image Super-Resolution [SupR]
Simiao Li, Yun-feng Zhang, Wei Li, Hanting Chen, Wenjia Wang, Bingyi Jing, Shaohui Lin, Jie Hu (03 Apr 2024)

Self-supervised Video Object Segmentation with Distillation Learning of Deformable Attention [VOS]
Quang-Trung Truong, Duc Thanh Nguyen, Binh-Son Hua, Sai-Kit Yeung (25 Jan 2024)

Maximizing Discrimination Capability of Knowledge Distillation with Energy Function
Seonghak Kim, Gyeongdo Ham, Suin Lee, Donggon Jang, Daeshik Kim (24 Nov 2023)

Leveraging Vision-Language Models for Improving Domain Generalization in Image Classification [VLM]
Sravanti Addepalli, Ashish Ramayee Asokan, Lakshay Sharma, R. V. Babu (12 Oct 2023)

LumiNet: The Bright Side of Perceptual Knowledge Distillation
Md. Ismail Hossain, M. M. L. Elahi, Sameera Ramasinghe, A. Cheraghian, Fuad Rahman, Nabeel Mohammed, Shafin Rahman (05 Oct 2023)

CrossKD: Cross-Head Knowledge Distillation for Object Detection
Jiabao Wang, Yuming Chen, Zhaohui Zheng, Xiang Li, Ming-Ming Cheng, Qibin Hou (20 Jun 2023)

VanillaKD: Revisit the Power of Vanilla Knowledge Distillation from Small Scale to Large Scale
Zhiwei Hao, Jianyuan Guo, Kai Han, Han Hu, Chang Xu, Yunhe Wang (25 May 2023)

Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu Liu, Hengshuang Zhao, Jiaya Jia (19 Apr 2021)

Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching
Mingi Ji, Byeongho Heo, Sungrae Park (05 Feb 2021)

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications [3DH]
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam (17 Apr 2017)