Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer

29 April 2021
Zhiyuan Wu, Yu-Gang Jiang, Minghao Zhao, Chupeng Cui, Zongmin Yang, Xinhui Xue, Hong Qi
VLM
ArXiv · PDF · HTML

Papers citing "Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer"

5 / 5 papers shown
Knowledge Distillation in Federated Edge Learning: A Survey
Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Xue Jiang, Runhan Li, Bo Gao
FedML · 4 citations · 14 Jan 2023

FedICT: Federated Multi-task Distillation for Multi-access Edge Computing
Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Quyang Pan, Xue Jiang, Bo Gao
31 citations · 01 Jan 2023

Domain-incremental Cardiac Image Segmentation with Style-oriented Replay and Domain-sensitive Feature Whitening
Kang Li, Lequan Yu, Pheng-Ann Heng
CLL · 22 citations · 09 Nov 2022

Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
Chelsea Finn, Pieter Abbeel, Sergey Levine
OOD · 11,700 citations · 09 Mar 2017

SegNet: A Deep Convolutional Encoder-Decoder Architecture for Image Segmentation
Vijay Badrinarayanan, Alex Kendall, R. Cipolla
SSeg · 15,645 citations · 02 Nov 2015