M2KD: Multi-model and Multi-level Knowledge Distillation for Incremental Learning
arXiv:1904.01769 · 3 April 2019
Peng Zhou, Long Mai, Jianming Zhang, N. Xu, Zuxuan Wu, L. Davis · CLL, VLM
Papers citing "M2KD: Multi-model and Multi-level Knowledge Distillation for Incremental Learning" (18 of 18 papers shown)
Realistic Continual Learning Approach using Pre-trained Models · Nadia Nasri, Carlos Gutiérrez-Álvarez, Sergio Lafuente-Arroyo, Saturnino Maldonado-Bascón, Roberto J. López-Sastre · CLL · 11 Apr 2024
Privacy-Preserving Synthetic Continual Semantic Segmentation for Robotic Surgery · Mengya Xu, Mobarakol Islam, Long Bai, Hongliang Ren · 08 Feb 2024
Teacher-Student Architecture for Knowledge Distillation: A Survey · Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu · 08 Aug 2023
Neural Collapse Terminus: A Unified Solution for Class Incremental Learning and Its Variants · Yibo Yang, Haobo Yuan, Hefei Ling, Jianlong Wu, Lefei Zhang, Zhouchen Lin, Philip Torr, Dacheng Tao, Guohao Li · CLL · 03 Aug 2023
Online Continual Learning via the Knowledge Invariant and Spread-out Properties · Ya-nan Han, Jian-wei Liu · CLL · 02 Feb 2023
EPIK: Eliminating multi-model Pipelines with Knowledge-distillation · Bhavesh Laddagiri, Yash Raj, Anshuman Dash · 27 Nov 2022
Teacher-Student Architecture for Knowledge Learning: A Survey · Chengming Hu, Xuan Li, Dan Liu, Xi Chen, Ju Wang, Xue Liu · 28 Oct 2022
Generalized Knowledge Distillation via Relationship Matching · Han-Jia Ye, Su Lu, De-Chuan Zhan · FedML · 04 May 2022
Memory Efficient Continual Learning with Transformers · Beyza Ermis, Giovanni Zappella, Martin Wistuba, Aditya Rawal, Cédric Archambeau · CLL · 09 Mar 2022
DyTox: Transformers for Continual Learning with DYnamic TOken eXpansion · Arthur Douillard, Alexandre Ramé, Guillaume Couairon, Matthieu Cord · CLL · 22 Nov 2021
DIODE: Dilatable Incremental Object Detection · Can Peng, Kun-li Zhao, Sam Maksoud, Tianren Wang, Brian C. Lovell · CLL, ObjD · 12 Aug 2021
There is More than Meets the Eye: Self-Supervised Multi-Object Detection and Tracking with Sound by Distilling Multimodal Knowledge · Francisco Rivera Valverde, Juana Valeria Hurtado, Abhinav Valada · 01 Mar 2021
Initial Classifier Weights Replay for Memoryless Class Incremental Learning · Eden Belouadah, Adrian Daniel Popescu, Ioannis Kanellos · CLL · 31 Aug 2020
Knowledge Distillation: A Survey · Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao · VLM · 09 Jun 2020
PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning · Arthur Douillard, Matthieu Cord, Charles Ollion, Thomas Robert, Eduardo Valle · CLL · 28 Apr 2020
ScaIL: Classifier Weights Scaling for Class Incremental Learning · Eden Belouadah, Adrian Daniel Popescu · CLL · 16 Jan 2020
Maintaining Discrimination and Fairness in Class Incremental Learning · Bowen Zhao, Xi Xiao, Guojun Gan, Bin Zhang, Shutao Xia · CLL · 16 Nov 2019
Knowledge Distillation for Incremental Learning in Semantic Segmentation · Umberto Michieli, Pietro Zanuttigh · CLL, VLM · 08 Nov 2019