
Building a Multi-domain Neural Machine Translation Model using Knowledge Distillation

15 April 2020
Idriss Mghabbar, Pirashanth Ratnamogan
arXiv:2004.07324

Papers citing "Building a Multi-domain Neural Machine Translation Model using Knowledge Distillation"

7 papers
Too Brittle To Touch: Comparing the Stability of Quantization and Distillation Towards Developing Lightweight Low-Resource MT Models
Harshita Diddee, Sandipan Dandapat, Monojit Choudhury, T. Ganu, Kalika Bali
27 Oct 2022
Revisiting Label Smoothing and Knowledge Distillation Compatibility: What was Missing?
Keshigeyan Chandrasegaran, Ngoc-Trung Tran, Yunqing Zhao, Ngai-Man Cheung
29 Jun 2022
SocialBERT -- Transformers for Online SocialNetwork Language Modelling
I. Karpov, Nick Kartashev
13 Nov 2021
Domain Adaptation and Multi-Domain Adaptation for Neural Machine Translation: A Survey
Danielle Saunders
14 Apr 2021
*-CFQ: Analyzing the Scalability of Machine Learning on a Compositional Task
Dmitry Tsarkov, Tibor Tihon, Nathan Scales, Nikola Momchev, Danila Sinopalnikov, Nathanael Schärli
15 Dec 2020
OpenNMT: Open-Source Toolkit for Neural Machine Translation
Guillaume Klein, Yoon Kim, Yuntian Deng, Jean Senellart, Alexander M. Rush
10 Jan 2017
Effective Approaches to Attention-based Neural Machine Translation
Thang Luong, Hieu H. Pham, Christopher D. Manning
17 Aug 2015