
Understanding and Improving Sequence-to-Sequence Pretraining for Neural Machine Translation

16 March 2022
Wenxuan Wang, Wenxiang Jiao, Yongchang Hao, Xing Wang, Shuming Shi, Zhaopeng Tu, Michael Lyu
AIMat

Papers citing "Understanding and Improving Sequence-to-Sequence Pretraining for Neural Machine Translation"

3 / 3 papers shown
Transferable Adversarial Attacks on Vision Transformers with Token Gradient Regularization
Jianping Zhang, Yizhan Huang, Weibin Wu, Michael R. Lyu
AAML, ViT
28 Mar 2023

Improving Neural Machine Translation by Denoising Training
Liang Ding, Keqin Peng, Dacheng Tao
VLM, AI4CE
19 Jan 2022

Understanding and Improving Lexical Choice in Non-Autoregressive Translation
Liang Ding, Longyue Wang, Xuebo Liu, Derek F. Wong, Dacheng Tao, Zhaopeng Tu
29 Dec 2020