Learning Hard Retrieval Decoder Attention for Transformers

30 September 2020
Hongfei Xu, Qiuhui Liu, Josef van Genabith, Deyi Xiong

Papers citing "Learning Hard Retrieval Decoder Attention for Transformers"

2 / 2 papers shown
How Does Selective Mechanism Improve Self-Attention Networks?
Xinwei Geng, Longyue Wang, Xing Wang, Bing Qin, Ting Liu, Zhaopeng Tu
03 May 2020
Effective Approaches to Attention-based Neural Machine Translation
Thang Luong, Hieu H. Pham, Christopher D. Manning
17 Aug 2015