arXiv: 2307.01381
Implicit Memory Transformer for Computationally Efficient Simultaneous Speech Translation
3 July 2023
Matthew Raffel
Lizhong Chen
Papers cited by "Implicit Memory Transformer for Computationally Efficient Simultaneous Speech Translation" (7 papers):
- SimulMT to SimulST: Adapting Simultaneous Text Translation to End-to-End Simultaneous Speech Translation. Xutai Ma, J. Pino, Philipp Koehn. 03 Nov 2020.
- Streaming Simultaneous Speech Translation with Augmented Memory Transformer. Xutai Ma, Yongqiang Wang, M. Dousti, Philipp Koehn, J. Pino. 30 Oct 2020.
- SimulEval: An Evaluation Toolkit for Simultaneous Translation. Xutai Ma, M. Dousti, Changhan Wang, Jiatao Gu, J. Pino. 31 Jul 2020.
- fairseq: A Fast, Extensible Toolkit for Sequence Modeling. Myle Ott, Sergey Edunov, Alexei Baevski, Angela Fan, Sam Gross, Nathan Ng, David Grangier, Michael Auli. 01 Apr 2019.
- Self-Attention Aligner: A Latency-Control End-to-End Model for ASR Using Self-Attention Network and Chunk-Hopping. Linhao Dong, Feng Wang, Bo Xu. 18 Feb 2019.
- Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context. Zihang Dai, Zhilin Yang, Yiming Yang, J. Carbonell, Quoc V. Le, Ruslan Salakhutdinov. 09 Jan 2019.
- A Call for Clarity in Reporting BLEU Scores. Matt Post. 23 Apr 2018.