Effective Approaches to Attention-based Neural Machine Translation

EMNLP 2015 · Minh-Thang Luong, Hieu Pham, Christopher D. Manning

An attentional mechanism has lately been used to improve neural machine translation (NMT) by selectively focusing on parts of the source sentence during translation. However, there has been little work exploring useful architectures for attention-based NMT...
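The paper's global attention computes a softmax over scores between the current decoder state and every source hidden state, then takes the weighted sum as a context vector. A minimal NumPy sketch using the paper's simplest scoring function, the dot product (function and variable names here are illustrative, not from the paper):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax
    e = np.exp(x - x.max())
    return e / e.sum()

def global_attention(h_t, encoder_states):
    """Global attention with the dot score.

    h_t:            (d,)   current decoder hidden state
    encoder_states: (S, d) source-side hidden states
    Returns the context vector and the alignment weights.
    """
    scores = encoder_states @ h_t      # (S,) dot-product alignment scores
    align = softmax(scores)            # attention weights over source positions
    context = align @ encoder_states   # (d,) weighted sum of source states
    return context, align

# toy example: 3 source positions, hidden size 4
enc = np.array([[1., 0., 0., 0.],
                [0., 1., 0., 0.],
                [0., 0., 1., 0.]])
h = np.array([0., 10., 0., 0.])        # decoder state closest to source position 1
ctx, a = global_attention(h, enc)
```

The paper also proposes "general" and "concat" scoring functions, and a local variant that restricts attention to a window around a predicted source position; the sketch above covers only the global/dot case.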

