Only 5% Attention Is All You Need: Efficient Long-range Document-level Neural Machine Translation

25 September 2023 · arXiv:2309.14174
Zihan Liu, Zewei Sun, Shanbo Cheng, Shujian Huang, Mingxuan Wang

Papers citing "Only 5% Attention Is All You Need: Efficient Long-range Document-level Neural Machine Translation"

3 / 3 citing papers shown

Investigating Length Issues in Document-level Machine Translation
Ziqian Peng, Rachel Bawden, François Yvon · 23 Dec 2024

Big Bird: Transformers for Longer Sequences
Manzil Zaheer, Guru Guruganesh, Kumar Avinava Dubey, Joshua Ainslie, Chris Alberti, ..., Philip Pham, Anirudh Ravula, Qifan Wang, Li Yang, Amr Ahmed · 28 Jul 2020

Efficient Content-Based Sparse Attention with Routing Transformers
Aurko Roy, M. Saffar, Ashish Vaswani, David Grangier · 12 Mar 2020