
Decouple Non-parametric Knowledge Distillation For End-to-end Speech Translation

20 April 2023
Hao Zhang, Nianwen Si, Yaqi Chen, Wenlin Zhang, Xukui Yang, Dan Qu, Zhen Li
arXiv (abs) · PDF · HTML

Papers citing "Decouple Non-parametric Knowledge Distillation For End-to-end Speech Translation"

2 / 2 papers shown
Retrieval and Distill: A Temporal Data Shift-Free Paradigm for Online Recommendation System
Lei Zheng, Ning Li, Weinan Zhang, Yong Yu
AI4TS · 24 Apr 2024
Billion-scale similarity search with GPUs
Jeff Johnson, Matthijs Douze, Hervé Jégou
28 Feb 2017