
Cross-View Consistency Regularisation for Knowledge Distillation
arXiv:2412.16493 · 21 December 2024
W. Zhang, Dongnan Liu, Weidong Cai, Chao Ma

Papers citing "Cross-View Consistency Regularisation for Knowledge Distillation" (2 papers):
1. DeepKD: A Deeply Decoupled and Denoised Knowledge Distillation Trainer
   Haiduo Huang, Jiangcheng Song, Yadong Zhang, Pengju Ren · 21 May 2025
2. VRM: Knowledge Distillation via Virtual Relation Matching
   W. Zhang, Fei Xie, Weidong Cai, Chao Ma · 28 February 2025