Towards a Smaller Student: Capacity Dynamic Distillation for Efficient Image Retrieval

16 March 2023
Yi Xie, Huaidong Zhang, Xuemiao Xu, Jianqing Zhu, Shengfeng He

Papers citing "Towards a Smaller Student: Capacity Dynamic Distillation for Efficient Image Retrieval"

4 papers shown
Rethinking Multi-view Representation Learning via Distilled Disentangling
Guanzhou Ke, Bo Wang, Xiaoli Wang, Shengfeng He
16 Mar 2024
Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021
RepVGG: Making VGG-style ConvNets Great Again
Xiaohan Ding, Xinming Zhang, Ningning Ma, Jungong Han, Guiguang Ding, Jian Sun
11 Jan 2021
Smooth-AP: Smoothing the Path Towards Large-Scale Image Retrieval
A. Brown, Weidi Xie, Vicky Kalogeiton, Andrew Zisserman
23 Jul 2020