Reprogramming Distillation for Medical Foundation Models

9 July 2024
Yuhang Zhou, Siyuan Du, Haolin Li, Jiangchao Yao, Ya Zhang, Yanfeng Wang

Papers citing "Reprogramming Distillation for Medical Foundation Models"

Showing 6 of 6 papers.

MeLo: Low-rank Adaptation is Better than Fine-tuning for Medical Image Diagnosis
Yitao Zhu, Zhenrong Shen, Zihao Zhao, Sheng Wang, Xin Wang, Xiangyu Zhao, Dinggang Shen, Qian Wang
14 Nov 2023 · MedIm

NORM: Knowledge Distillation via N-to-One Representation Matching
Xiaolong Liu, Lujun Li, Chao Li, Anbang Yao
23 May 2023

LPT: Long-tailed Prompt Tuning for Image Classification
Bowen Dong, Pan Zhou, Shuicheng Yan, W. Zuo
03 Oct 2022 · VPVLM · VLM

AdaptFormer: Adapting Vision Transformers for Scalable Visual Recognition
Shoufa Chen, Chongjian Ge, Zhan Tong, Jiangliu Wang, Yibing Song, Jue Wang, Ping Luo
26 May 2022

Model Reprogramming: Resource-Efficient Cross-Domain Machine Learning
Pin-Yu Chen
22 Feb 2022 · VLM

Masked Autoencoders Are Scalable Vision Learners
Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross B. Girshick
11 Nov 2021 · ViT · TPM