Boosting Residual Networks with Group Knowledge (arXiv:2308.13772)
26 August 2023
Shengji Tang, Peng Ye, Baopu Li, Wei Lin, Tao Chen, Tong He, Chong Yu, Wanli Ouyang
Papers citing "Boosting Residual Networks with Group Knowledge" (9 of 9 papers shown):
S2HPruner: Soft-to-Hard Distillation Bridges the Discretization Gap in Pruning
Weihao Lin, Shengji Tang, Chong Yu, Peng Ye, Tao Chen
09 Oct 2024

Enhanced Sparsification via Stimulative Training
Shengji Tang, Weihao Lin, Hancheng Ye, Peng Ye, Chong Yu, Baopu Li, Tao Chen
11 Mar 2024

Partial Fine-Tuning: A Successor to Full Fine-Tuning for Vision Transformers
Peng Ye, Yongqi Huang, Chongjun Tu, Minglei Li, Tao Chen, Tong He, Wanli Ouyang
25 Dec 2023

Stimulative Training++: Go Beyond The Performance Limits of Residual Networks
XinYu Piao, Tong He, DoangJoo Synn, Baopu Li, Tao Chen, Lei Bai, Jong-Kook Kim
04 May 2023

Asymmetric Temperature Scaling Makes Larger Networks Teach Well Again
Xin-Chun Li, Wenxuan Fan, Shaoming Song, Yinchuan Li, Bingshuai Li, Yunfeng Shao, De-Chuan Zhan
10 Oct 2022

Stimulative Training of Residual Networks: A Social Psychology Perspective of Loafing
Peng Ye, Shengji Tang, Baopu Li, Tao Chen, Wanli Ouyang
09 Oct 2022

MLP-Mixer: An all-MLP Architecture for Vision
Ilya O. Tolstikhin, N. Houlsby, Alexander Kolesnikov, Lucas Beyer, Xiaohua Zhai, ..., Andreas Steiner, Daniel Keysers, Jakob Uszkoreit, Mario Lucic, Alexey Dosovitskiy
04 May 2021

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
12 Jun 2018

Densely Connected Convolutional Networks
Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger
25 Aug 2016