ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Simplifying Neural Machine Translation with Addition-Subtraction Twin-Gated Recurrent Networks (arXiv:1810.12546)

30 October 2018
Biao Zhang, Deyi Xiong, Jinsong Su, Qian Lin, Huiji Zhang

Papers citing "Simplifying Neural Machine Translation with Addition-Subtraction Twin-Gated Recurrent Networks"

4 / 4 papers shown
A Survey of Deep Learning Techniques for Neural Machine Translation
Shu Yang, Yuxin Wang, X. Chu
VLM · AI4TS · AI4CE
Citations: 138
18 Feb 2020
A Lightweight Recurrent Network for Sequence Modeling
Biao Zhang, Rico Sennrich
Citations: 7
30 May 2019
Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Yonghui Wu, M. Schuster, Z. Chen, Quoc V. Le, Mohammad Norouzi, ..., Alex Rudnick, Oriol Vinyals, G. Corrado, Macduff Hughes, J. Dean
AIMat
Citations: 6,743
26 Sep 2016
Effective Approaches to Attention-based Neural Machine Translation
Thang Luong, Hieu H. Pham, Christopher D. Manning
Citations: 7,923
17 Aug 2015