
M2KD: Multi-model and Multi-level Knowledge Distillation for Incremental Learning (arXiv:1904.01769)
3 April 2019 · Peng Zhou, Long Mai, Jianming Zhang, N. Xu, Zuxuan Wu, L. Davis · Topics: CLL, VLM

Papers citing "M2KD: Multi-model and Multi-level Knowledge Distillation for Incremental Learning"

18 papers shown.

Realistic Continual Learning Approach using Pre-trained Models
  Nadia Nasri, Carlos Gutiérrez-Álvarez, Sergio Lafuente-Arroyo, Saturnino Maldonado-Bascón, Roberto J. López-Sastre
  Topics: CLL · Citations: 0 · 11 Apr 2024

Privacy-Preserving Synthetic Continual Semantic Segmentation for Robotic Surgery
  Mengya Xu, Mobarakol Islam, Long Bai, Hongliang Ren
  Citations: 5 · 08 Feb 2024

Teacher-Student Architecture for Knowledge Distillation: A Survey
  Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu
  Citations: 16 · 08 Aug 2023

Neural Collapse Terminus: A Unified Solution for Class Incremental Learning and Its Variants
  Yibo Yang, Haobo Yuan, Hefei Ling, Jianlong Wu, Lefei Zhang, Zhouchen Lin, Philip Torr, Dacheng Tao, Guohao Li
  Topics: CLL · Citations: 8 · 03 Aug 2023

Online Continual Learning via the Knowledge Invariant and Spread-out Properties
  Ya-nan Han, Jian-wei Liu
  Topics: CLL · Citations: 7 · 02 Feb 2023

EPIK: Eliminating multi-model Pipelines with Knowledge-distillation
  Bhavesh Laddagiri, Yash Raj, Anshuman Dash
  Citations: 0 · 27 Nov 2022

Teacher-Student Architecture for Knowledge Learning: A Survey
  Chengming Hu, Xuan Li, Dan Liu, Xi Chen, Ju Wang, Xue Liu
  Citations: 35 · 28 Oct 2022

Generalized Knowledge Distillation via Relationship Matching
  Han-Jia Ye, Su Lu, De-Chuan Zhan
  Topics: FedML · Citations: 20 · 04 May 2022

Memory Efficient Continual Learning with Transformers
  Beyza Ermis, Giovanni Zappella, Martin Wistuba, Aditya Rawal, Cédric Archambeau
  Topics: CLL · Citations: 43 · 09 Mar 2022

DyTox: Transformers for Continual Learning with DYnamic TOken eXpansion
  Arthur Douillard, Alexandre Ramé, Guillaume Couairon, Matthieu Cord
  Topics: CLL · Citations: 298 · 22 Nov 2021

DIODE: Dilatable Incremental Object Detection
  Can Peng, Kun-li Zhao, Sam Maksoud, Tianren Wang, Brian C. Lovell
  Topics: CLL, ObjD · Citations: 9 · 12 Aug 2021

There is More than Meets the Eye: Self-Supervised Multi-Object Detection and Tracking with Sound by Distilling Multimodal Knowledge
  Francisco Rivera Valverde, Juana Valeria Hurtado, Abhinav Valada
  Citations: 72 · 01 Mar 2021

Initial Classifier Weights Replay for Memoryless Class Incremental Learning
  Eden Belouadah, Adrian Daniel Popescu, Ioannis Kanellos
  Topics: CLL · Citations: 23 · 31 Aug 2020

Knowledge Distillation: A Survey
  Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
  Topics: VLM · Citations: 2,851 · 09 Jun 2020

PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning
  Arthur Douillard, Matthieu Cord, Charles Ollion, Thomas Robert, Eduardo Valle
  Topics: CLL · Citations: 6 · 28 Apr 2020

ScaIL: Classifier Weights Scaling for Class Incremental Learning
  Eden Belouadah, Adrian Daniel Popescu
  Topics: CLL · Citations: 78 · 16 Jan 2020

Maintaining Discrimination and Fairness in Class Incremental Learning
  Bowen Zhao, Xi Xiao, Guojun Gan, Bin Zhang, Shutao Xia
  Topics: CLL · Citations: 416 · 16 Nov 2019

Knowledge Distillation for Incremental Learning in Semantic Segmentation
  Umberto Michieli, Pietro Zanuttigh
  Topics: CLL, VLM · Citations: 98 · 08 Nov 2019