On Optimal Transformer Depth for Low-Resource Language Translation
Elan Van Biljon, Arnu Pretorius, Julia Kreutzer
arXiv: 2004.04418 · 9 April 2020
Papers citing "On Optimal Transformer Depth for Low-Resource Language Translation" (6 of 6 shown):

State Space Models for Extractive Summarization in Low Resource Scenarios. Nisrine Ait Khayi. 24 Jan 2025.
Human Evaluation of English–Irish Transformer-Based NMT. Séamus Lankford, Haithem Afli, Andy Way. 4 Mar 2024.
Small Batch Sizes Improve Training of Low-Resource Neural MT. Àlex R. Atrio, Andrei Popescu-Belis. 20 Mar 2022.
Optimizing Transformer for Low-Resource Neural Machine Translation. Ali Araabi, Christof Monz. 4 Nov 2020.
A Survey on Recent Approaches for Natural Language Processing in Low-Resource Scenarios. Michael A. Hedderich, Lukas Lange, Heike Adel, Jannik Strötgen, Dietrich Klakow. 23 Oct 2020.
Query-Key Normalization for Transformers. Alex Henry, Prudhvi Raj Dachapally, S. Pawar, Yuxuan Chen. 8 Oct 2020.