no code implementations • 17 Feb 2024 • Hiroyuki Deguchi, Yusuke Sakai, Hidetaka Kamigaito, Taro Watanabe, Hideki Tanaka, Masao Utiyama
Minimum Bayes risk (MBR) decoding achieves state-of-the-art translation performance by using COMET, a neural metric that correlates highly with human evaluation.
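The core idea of MBR decoding can be sketched as follows. This is a minimal illustration, not the paper's method: the `utility` function below is a toy token-overlap proxy standing in for a trained metric such as COMET, and the candidate pool itself serves as the set of pseudo-references.

```python
def utility(hyp, ref):
    # Toy utility: Jaccard overlap of token sets.
    # Real MBR decoding would use a neural metric such as COMET here.
    h, r = set(hyp.split()), set(ref.split())
    return len(h & r) / max(len(h | r), 1)

def mbr_decode(candidates):
    """Select the candidate with the highest expected utility,
    averaged over all other candidates as pseudo-references."""
    best, best_score = None, float("-inf")
    for hyp in candidates:
        refs = [ref for ref in candidates if ref is not hyp]
        score = sum(utility(hyp, ref) for ref in refs) / max(len(refs), 1)
        if score > best_score:
            best, best_score = hyp, score
    return best
```

Note that this naive formulation costs O(n^2) utility evaluations for n candidates, which is exactly why reducing the number of metric calls matters when the utility is an expensive neural model.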
1 code implementation • 18 Oct 2023 • Hiroyuki Deguchi, Hayate Hirano, Tomoki Hoshino, Yuto Nishida, Justin Vasselli, Taro Watanabe
We publish knn-seq as an MIT-licensed open-source project; the code is available at https://github.com/naist-nlp/knn-seq.
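At the heart of kNN-based sequence generation is a datastore lookup: decoder hidden states are stored as keys with their next target tokens as values, and generation retrieves the nearest keys at each step. A minimal sketch with toy vectors (the helper names are illustrative, not knn-seq's API):

```python
import numpy as np

def build_datastore(pairs):
    """Build a (key matrix, value list) datastore from
    (vector, target-token) pairs. In kNN-MT the keys would be
    decoder hidden states collected over the training data."""
    keys = np.array([k for k, _ in pairs], dtype=float)
    values = [v for _, v in pairs]
    return keys, values

def knn_lookup(query, keys, values, k=2):
    """Return the target tokens of the k nearest keys by L2 distance."""
    dists = np.linalg.norm(keys - np.asarray(query, dtype=float), axis=1)
    idx = np.argsort(dists)[:k]
    return [values[i] for i in idx]
```

In practice, toolkits like knn-seq back this lookup with an approximate-nearest-neighbor index rather than the brute-force scan shown here.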
no code implementations • ACL 2021 • Hiroyuki Deguchi, Akihiro Tamura, Takashi Ninomiya
This paper proposes a novel attention mechanism for Transformer Neural Machine Translation, "Synchronous Syntactic Attention," inspired by synchronous dependency grammars.
no code implementations • COLING 2020 • Hiroyuki Deguchi, Masao Utiyama, Akihiro Tamura, Takashi Ninomiya, Eiichiro Sumita
This paper proposes a new subword segmentation method for neural machine translation, "Bilingual Subword Segmentation," which tokenizes sentences to minimize the difference between the number of subword units in a sentence and that of its translation.
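The selection criterion described above can be sketched as follows. This is a simplified illustration of the objective only, not the paper's segmentation algorithm: given candidate segmentations for a sentence and its translation, it picks the pair whose subword counts differ least.

```python
from itertools import product

def pick_bilingual_segmentation(src_options, tgt_options):
    """Among candidate segmentations (lists of subword units) for a
    source sentence and its translation, choose the pair minimizing
    the difference in unit counts, breaking ties by total length."""
    return min(
        product(src_options, tgt_options),
        key=lambda p: (abs(len(p[0]) - len(p[1])), len(p[0]) + len(p[1])),
    )
```

For example, pairing a three-unit segmentation of "unbelievable" with a three-unit segmentation of its French translation would be preferred over a one-unit/three-unit pairing.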
no code implementations • RANLP 2019 • Hiroyuki Deguchi, Akihiro Tamura, Takashi Ninomiya
In this paper, we propose a new Transformer neural machine translation (NMT) model, dependency-based self-attention, which incorporates dependency relations into self-attention on both the source and target sides.
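One simple way to inject dependency relations into self-attention is to restrict which positions may attend to one another along dependency arcs. The sketch below builds such a mask from a head-index array; it is a hypothetical illustration of the general idea, not the specific formulation in this paper.

```python
import numpy as np

def dependency_attention_mask(heads):
    """Build a boolean self-attention mask from dependency heads.
    heads[i] is the index of token i's head (-1 for the root).
    Each token may attend to itself, its head, and its direct
    dependents; all other positions are masked out."""
    n = len(heads)
    mask = np.eye(n, dtype=bool)          # every token attends to itself
    for i, h in enumerate(heads):
        if h >= 0:
            mask[i, h] = True             # child attends to its head
            mask[h, i] = True             # head attends to its child
    return mask
```

Such a mask (or a soft, bias-style variant of it) would be applied inside each self-attention layer before the softmax over attention scores.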