Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer
arXiv:2104.14696 · 29 April 2021
Zhiyuan Wu, Yu-Gang Jiang, Minghao Zhao, Chupeng Cui, Zongmin Yang, Xinhui Xue, Hong Qi
Tags: VLM
Papers citing "Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer" (5 of 5 papers shown):
1. Knowledge Distillation in Federated Edge Learning: A Survey
   Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Xue Jiang, Runhan Li, Bo Gao
   FedML · 4 citations · 14 Jan 2023
2. FedICT: Federated Multi-task Distillation for Multi-access Edge Computing
   Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Quyang Pan, Xue Jiang, Bo Gao
   31 citations · 01 Jan 2023
3. Domain-incremental Cardiac Image Segmentation with Style-oriented Replay and Domain-sensitive Feature Whitening
   Kang Li, Lequan Yu, Pheng-Ann Heng
   CLL · 22 citations · 09 Nov 2022
4. Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
   Chelsea Finn, Pieter Abbeel, Sergey Levine
   OOD · 11,700 citations · 09 Mar 2017
5. SegNet: A Deep Convolutional Encoder-Decoder Architecture for Image Segmentation
   Vijay Badrinarayanan, Alex Kendall, Roberto Cipolla
   SSeg · 15,645 citations · 02 Nov 2015