ACL 2019 • Shohei Iida, Ryuichiro Kimura, Hongyi Cui, Po-Hsuan Hung, Takehito Utsuro, Masaaki Nagata
The first-hop attention is the scaled dot-product attention, the same attention mechanism used in the original Transformer.
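As a minimal sketch of scaled dot-product attention (a generic NumPy illustration, not the authors' implementation; the function name and shapes are assumptions):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value matrices."""
    d_k = Q.shape[-1]
    # Scale the dot products to keep softmax inputs in a stable range.
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights
```

Each row of `weights` is a probability distribution over the keys, so the output is a convex combination of the value vectors.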
WS 2017 • Zi Long, Ryuichiro Kimura, Takehito Utsuro, Tomoharu Mitsuhashi, Mikio Yamamoto
Long et al. (2017) proposed selecting phrases that contain out-of-vocabulary words using the statistical approach of branching entropy.
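Branching entropy measures how unpredictable the token following a given context is: a high value suggests a phrase boundary. A small count-based sketch (illustrative only; the function name and corpus handling are assumptions, not the authors' formulation):

```python
from collections import Counter
import math

def branching_entropy(corpus_tokens, context):
    """Entropy (bits) of the token distribution immediately after `context`.

    corpus_tokens: list of tokens; context: tuple of tokens.
    """
    n = len(context)
    nexts = Counter()
    # Collect the token that follows every occurrence of the context.
    for i in range(len(corpus_tokens) - n):
        if tuple(corpus_tokens[i:i + n]) == context:
            nexts[corpus_tokens[i + n]] += 1
    total = sum(nexts.values())
    if total == 0:
        return 0.0
    # H = -sum_w p(w | context) log2 p(w | context)
    return -sum((c / total) * math.log2(c / total) for c in nexts.values())
```

For example, if "a" is followed twice by "b" and once by "c", the branching entropy of the context ("a",) is about 0.92 bits.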
MTSummit 2017 • Zi Long, Ryuichiro Kimura, Takehito Utsuro, Tomoharu Mitsuhashi, Mikio Yamamoto
Neural machine translation (NMT), a new approach to machine translation, has achieved promising results comparable to those of traditional approaches such as statistical machine translation (SMT).