Search Results for author: Hiroyuki Deguchi

Found 5 papers, 1 paper with code

Centroid-Based Efficient Minimum Bayes Risk Decoding

no code implementations • 17 Feb 2024 Hiroyuki Deguchi, Yusuke Sakai, Hidetaka Kamigaito, Taro Watanabe, Hideki Tanaka, Masao Utiyama

Minimum Bayes risk (MBR) decoding has achieved state-of-the-art translation performance by using COMET, a neural metric that correlates highly with human evaluation.
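The core of MBR decoding can be sketched as follows: among a set of candidate translations, pick the one with the highest expected utility against the other candidates used as pseudo-references. The paper uses COMET as the utility; the token-overlap F1 below is a toy stand-in for illustration, and all names are assumptions, not the authors' code.

```python
# Minimal MBR decoding sketch. The utility here is token-overlap F1,
# a toy stand-in for COMET (the neural metric used in the paper).

def utility(hyp, ref):
    """Token-overlap F1 between a hypothesis and a (pseudo-)reference."""
    h, r = set(hyp.split()), set(ref.split())
    if not h or not r:
        return 0.0
    overlap = len(h & r)
    p, q = overlap / len(h), overlap / len(r)
    return 0.0 if p + q == 0 else 2 * p * q / (p + q)

def mbr_decode(candidates):
    """Return the candidate with the highest expected utility, using the
    candidate set itself as uniformly weighted pseudo-references."""
    best, best_score = None, float("-inf")
    for hyp in candidates:
        score = sum(utility(hyp, ref) for ref in candidates) / len(candidates)
        if score > best_score:
            best, best_score = hyp, score
    return best
```

The quadratic number of utility calls in `mbr_decode` is exactly the cost that efficiency-oriented MBR variants, such as the centroid-based method above, aim to reduce.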

Translation

knn-seq: Efficient, Extensible kNN-MT Framework

1 code implementation • 18 Oct 2023 Hiroyuki Deguchi, Hayate Hirano, Tomoki Hoshino, Yuto Nishida, Justin Vasselli, Taro Watanabe

We publish knn-seq as an MIT-licensed open-source project; the code is available at https://github.com/naist-nlp/knn-seq.
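The kNN-MT idea that knn-seq implements can be sketched in a few lines: retrieve the nearest neighbors of the decoder's hidden state from a datastore of (representation, target-token) pairs, convert the distances into a retrieval distribution, and interpolate it with the NMT model's distribution. The function names, toy datastore, and brute-force search below are illustrative assumptions, not knn-seq's API.

```python
# Hedged sketch of kNN-MT token prediction, assuming a tiny in-memory
# datastore of (key vector, target token) pairs and brute-force search.
import math

def knn_distribution(query, datastore, k=2, temperature=1.0):
    """Softmax over negative squared distances of the k nearest entries."""
    dists = sorted(
        (sum((q - x) ** 2 for q, x in zip(query, key)), token)
        for key, token in datastore
    )[:k]
    weights = [math.exp(-d / temperature) for d, _ in dists]
    z = sum(weights)
    probs = {}
    for (_, token), w in zip(dists, weights):
        probs[token] = probs.get(token, 0.0) + w / z
    return probs

def interpolate(p_knn, p_nmt, lam=0.5):
    """p(y) = lam * p_kNN(y) + (1 - lam) * p_NMT(y)."""
    vocab = set(p_knn) | set(p_nmt)
    return {y: lam * p_knn.get(y, 0.0) + (1 - lam) * p_nmt.get(y, 0.0)
            for y in vocab}
```

In practice the datastore holds millions of entries, so frameworks like knn-seq rely on approximate nearest-neighbor indexes rather than the brute-force scan shown here.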

Machine Translation NMT +1

Synchronous Syntactic Attention for Transformer Neural Machine Translation

no code implementations ACL 2021 Hiroyuki Deguchi, Akihiro Tamura, Takashi Ninomiya

This paper proposes a novel attention mechanism for Transformer Neural Machine Translation, "Synchronous Syntactic Attention," inspired by synchronous dependency grammars.

Machine Translation Translation

Bilingual Subword Segmentation for Neural Machine Translation

no code implementations COLING 2020 Hiroyuki Deguchi, Masao Utiyama, Akihiro Tamura, Takashi Ninomiya, Eiichiro Sumita

This paper proposes a new subword segmentation method for neural machine translation, "Bilingual Subword Segmentation," which tokenizes sentences to minimize the difference between the number of subword units in a sentence and that of its translation.
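The stated objective can be sketched as a selection criterion: among candidate tokenizations of a sentence, prefer the one whose subword count is closest to the other side's. The candidate list below is supplied by hand; the paper builds on learned subword vocabularies, which this sketch does not model, and the function name is an assumption.

```python
# Toy sketch of the bilingual segmentation objective: pick the candidate
# segmentation whose subword count best matches the target's token count.
# Ties are broken in favor of fewer subwords (an illustrative choice).

def pick_segmentation(candidates, target_tokens):
    """candidates: list of tokenizations (lists of subwords) of one sentence.
    target_tokens: the translation's tokens. Returns the best candidate."""
    return min(
        candidates,
        key=lambda seg: (abs(len(seg) - len(target_tokens)), len(seg)),
    )
```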

Machine Translation Segmentation +2

Dependency-Based Self-Attention for Transformer NMT

no code implementations RANLP 2019 Hiroyuki Deguchi, Akihiro Tamura, Takashi Ninomiya

In this paper, we propose a new Transformer neural machine translation (NMT) model that incorporates dependency relations into self-attention on both source and target sides, dependency-based self-attention.
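One common way to inject dependency relations into self-attention is to restrict each token's attention to its dependency head, its children, and itself. The sketch below shows that masking scheme; it is an illustrative reading of "dependency-based self-attention," not the paper's exact formulation, and all names are assumptions.

```python
# Hedged sketch: mask raw self-attention logits so each token attends
# only to itself, its dependency head, and its dependents, then normalize.
import math

def dependency_masked_attention(scores, heads):
    """scores[i][j]: raw attention logit from token i to token j.
    heads[i]: index of token i's dependency head (i itself for the root).
    Returns row-normalized attention over dependency neighbors only."""
    n = len(scores)
    out = []
    for i in range(n):
        allowed = {i, heads[i]} | {j for j in range(n) if heads[j] == i}
        exps = [math.exp(scores[i][j]) if j in allowed else 0.0
                for j in range(n)]
        z = sum(exps)
        out.append([e / z for e in exps])
    return out
```

Applying such a mask on both the encoder and decoder sides corresponds to the paper's idea of using dependency relations on source and target simultaneously.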

Machine Translation NMT +2
