Towards Inadequately Pre-trained Models in Transfer Learning
arXiv 2203.04668 · 9 March 2022
Andong Deng, Xingjian Li, Di Hu, Tianyang Wang, Haoyi Xiong, Chengzhong Xu
Papers citing "Towards Inadequately Pre-trained Models in Transfer Learning" (7 of 7 papers shown)
Why pre-training is beneficial for downstream classification tasks?
Xin Jiang, Xu Cheng, Zechao Li
34 · 0 · 0 · 11 Oct 2024
Estimating Environmental Cost Throughout Model's Adaptive Life Cycle
Vishwesh Sangarya, Richard M. Bradford, Jung-Eun Kim
26 · 2 · 0 · 23 Jul 2024
Encourage or Inhibit Monosemanticity? Revisit Monosemanticity from a Feature Decorrelation Perspective
Hanqi Yan, Yanzheng Xiang, Guangyi Chen, Yifei Wang, Lin Gui, Yulan He
44 · 5 · 0 · 25 Jun 2024
BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation
Junnan Li, Dongxu Li, Caiming Xiong, Guosheng Lin
MLLM · BDL · VLM · CLIP
392 · 4,154 · 0 · 28 Jan 2022
Instance Localization for Self-supervised Detection Pretraining
Ceyuan Yang, Zhirong Wu, Bolei Zhou, Stephen Lin
ViT · SSL
100 · 145 · 0 · 16 Feb 2021
Improved Baselines with Momentum Contrastive Learning
Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He
SSL
279 · 3,375 · 0 · 9 Mar 2020
Transferability and Hardness of Supervised Classification Tasks
Anh Tran, Cuong V Nguyen, Tal Hassner
134 · 164 · 0 · 21 Aug 2019