Infusing Sequential Information into Conditional Masked Translation Model with Self-Review Mechanism

19 Oct 2020 · Pan Xie, Zhi Cui, Xiuyin Chen, Xiaohui Hu, Jianwei Cui, Bin Wang

Non-autoregressive models generate target words in parallel, achieving faster decoding speed but at the cost of translation accuracy. To remedy the flawed translations produced by non-autoregressive models, a promising approach is to train a conditional masked translation model (CMTM) and refine the generated results over several iterations...
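The iterative refinement mentioned in the abstract follows the general mask-predict style of CMTM decoding: generate all target tokens in parallel, then repeatedly re-mask the lowest-confidence positions and re-predict them conditioned on the rest. Below is a minimal, hedged sketch of that loop; the `cmtm_predict` stand-in, the mask token id, and the linear re-masking schedule are illustrative assumptions, not the authors' code or their self-review mechanism.

```python
# Minimal sketch of mask-predict style iterative refinement for a CMTM.
# `cmtm_predict` is a hypothetical stand-in for a trained model that returns
# token ids and per-position confidences; replace it with a real CMTM.
import numpy as np

MASK_ID = 0  # hypothetical mask token id


def cmtm_predict(src_tokens, tgt_tokens):
    """Stand-in for a trained CMTM: returns (predicted_ids, confidences)."""
    rng = np.random.default_rng(len(tgt_tokens))
    preds = rng.integers(1, 100, size=len(tgt_tokens))
    confs = rng.random(size=len(tgt_tokens))
    return preds, confs


def mask_predict(src_tokens, tgt_len, iterations=4):
    # Iteration 0: start from a fully masked target and predict every position.
    tgt = np.full(tgt_len, MASK_ID, dtype=int)
    preds, confs = cmtm_predict(src_tokens, tgt)
    tgt = preds
    for t in range(1, iterations):
        # Linearly decay the number of re-masked (lowest-confidence) tokens.
        n_mask = int(tgt_len * (iterations - t) / iterations)
        if n_mask == 0:
            break
        remask = np.argsort(confs)[:n_mask]
        tgt = tgt.copy()
        tgt[remask] = MASK_ID
        new_preds, new_confs = cmtm_predict(src_tokens, tgt)
        # Only the re-masked positions are updated in this iteration.
        tgt[remask] = new_preds[remask]
        confs[remask] = new_confs[remask]
    return tgt


if __name__ == "__main__":
    print(mask_predict(src_tokens=[5, 17, 23], tgt_len=6))
```

The linear decay of the re-masking budget is one common schedule for this family of models; the paper's contribution, infusing sequential information via a self-review mechanism, sits on top of such a refinement loop rather than replacing it.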
