Investigating Catastrophic Forgetting During Continual Training for Neural Machine Translation

2 November 2020
Shuhao Gu, Yang Feng
CLL

Papers citing "Investigating Catastrophic Forgetting During Continual Training for Neural Machine Translation"

4 / 4 papers shown
Continual Learning of Neural Machine Translation within Low Forgetting Risk Regions
Shuhao Gu, Bojie Hu, Yang Feng
CLL
03 Nov 2022
Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation
Mozhdeh Gheini, Xiang Ren, Jonathan May
LRM
18 Apr 2021
Domain Adaptation and Multi-Domain Adaptation for Neural Machine Translation: A Survey
Danielle Saunders
AI4CE
14 Apr 2021
Pruning-then-Expanding Model for Domain Adaptation of Neural Machine Translation
Shuhao Gu, Yang Feng, Wanying Xie
CLL, AI4CE
25 Mar 2021