Improving the Transformer Translation Model with Document-Level Context

EMNLP 2018 · Jiacheng Zhang, Huanbo Luan, Maosong Sun, FeiFei Zhai, Jingfang Xu, Min Zhang, Yang Liu

Although the Transformer translation model (Vaswani et al., 2017) has achieved state-of-the-art performance in a variety of translation tasks, how to use document-level context to deal with discourse phenomena problematic for Transformer remains a challenge. In this work, we extend the Transformer model with a new context encoder to represent document-level context, which is then incorporated into the original encoder and decoder...
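To make the architectural idea concrete, here is a minimal sketch of how attention over a context encoder's output could be added to a Transformer encoder layer. This is an illustration under simplifying assumptions (single-head attention, no layer normalization or feed-forward sub-layer, hypothetical weight names such as `w_cq`), not the paper's actual implementation, which builds on the full multi-head Transformer.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Standard scaled dot-product attention (Vaswani et al., 2017).
    q, k, v: arrays of shape (batch, length, d_model)."""
    d_k = q.shape[-1]
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def encoder_layer_with_context(x, ctx, w_q, w_k, w_v, w_cq, w_ck, w_cv):
    """Hypothetical encoder layer extended with document-level context.

    x:   (batch, src_len, d_model) current-sentence representations
    ctx: (batch, ctx_len, d_model) output of a separate context encoder
    w_*: (d_model, d_model) projection matrices (names are illustrative)
    """
    # Self-attention sub-layer over the current sentence (residual).
    x = x + scaled_dot_product_attention(x @ w_q, x @ w_k, x @ w_v)
    # Context-attention sub-layer: queries come from the sentence,
    # keys and values from the context encoder output (residual).
    x = x + scaled_dot_product_attention(x @ w_cq, ctx @ w_ck, ctx @ w_cv)
    return x
```

The same pattern of a context-attention sub-layer can analogously be inserted into decoder layers, so that both source and target representations can condition on the surrounding document.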
