ResearchTrend.AI
Triplet Loss for Knowledge Distillation
arXiv:2004.08116 · 17 April 2020
Hideki Oki, Motoshi Abe, J. Miyao, Takio Kurita

Papers citing "Triplet Loss for Knowledge Distillation" (2 papers)

TSCM: A Teacher-Student Model for Vision Place Recognition Using Cross-Metric Knowledge Distillation
Yehui Shen, Mingmin Liu, Huimin Lu, Xieyuanli Chen
02 Apr 2024

Generalized Knowledge Distillation via Relationship Matching
Han-Jia Ye, Su Lu, De-Chuan Zhan
04 May 2022