arXiv: 2011.01210
Focus on the present: a regularization method for the ASR source-target attention layer
2 November 2020
Nanxin Chen, Piotr Żelasko, Jesús Villalba, Najim Dehak
Papers citing "Focus on the present: a regularization method for the ASR source-target attention layer" (1 paper shown)
Relaxed Attention for Transformer Models
Timo Lohrenz, Björn Möller, Zhengyang Li, Tim Fingscheidt
20 Sep 2022