Search Results for author: Masato Neishi

Found 3 papers, 1 paper with code

On the Relation between Position Information and Sentence Length in Neural Machine Translation

No code implementations · CoNLL 2019 · Masato Neishi, Naoki Yoshinaga

Although approaches such as the attention mechanism have partially remedied the problem, we found that the current standard NMT model, the Transformer, has more difficulty translating long sentences than the former standard, the Recurrent Neural Network (RNN)-based model.

Tasks: Machine Translation, NMT, +4
