Effective Approaches to Attention-based Neural Machine Translation

An attentional mechanism has been used in neural machine translation (NMT) to selectively focus on parts of the source sentence during translation. However, there has been little work exploring useful architectures for attention-based NMT. This paper examines two simple and effective classes of attentional mechanism: a global approach that always attends to all source words and a local one that looks at only a subset of source words at a time. We demonstrate the effectiveness of both approaches on the WMT translation tasks between English and German in both directions. Our attentional NMT models yield a gain of up to 5.0 BLEU points over non-attentional systems that already incorporate known techniques such as dropout. For the English-to-German direction, we establish new state-of-the-art results of 23.0 BLEU on WMT'14 and 25.9 BLEU on WMT'15. Our in-depth analysis sheds light on which architectures work best, and we are the first to assess attentional models using alignment error rates.
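To make the global/local distinction concrete, below is a minimal NumPy sketch, not the authors' implementation: it contrasts attending over all source hidden states with attending over a small window around a position `p_t`. The dot-product score and the Gaussian weighting with standard deviation D/2 follow one common variant of local attention; the names `global_attention`, `local_attention`, `H_s`, `h_t`, and the window half-width `D` are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def global_attention(h_t, H_s):
    """Attend to every source position: alignment weights over all of H_s."""
    scores = H_s @ h_t            # dot-product score for each source word, shape (S,)
    a = softmax(scores)           # weights over the whole source sentence
    return a @ H_s                # context vector

def local_attention(h_t, H_s, p_t, D=2):
    """Attend only to a window [p_t - D, p_t + D] around position p_t."""
    S = H_s.shape[0]
    lo, hi = max(0, p_t - D), min(S, p_t + D + 1)
    window = H_s[lo:hi]
    a = softmax(window @ h_t)
    # Favor positions near p_t with a Gaussian (sigma = D / 2), as in local attention.
    pos = np.arange(lo, hi)
    a = a * np.exp(-((pos - p_t) ** 2) / (2 * (D / 2) ** 2))
    a = a / a.sum()
    return a @ window

# Toy usage: 6 source states and one decoder state, each of dimension 4.
H_s = np.random.randn(6, 4)
h_t = np.random.randn(4)
c_global = global_attention(h_t, H_s)
c_local = local_attention(h_t, H_s, p_t=3)
```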