no code implementations • 10 Apr 2024 • Xincan Feng, Akifumi Yoshimoto
Recent advancements in Natural Language Processing (NLP) have seen Large-scale Language Models (LLMs) excel at producing high-quality text for various purposes.
no code implementations • IJCNLP 2017 • An Nguyen Le, Ander Martinez, Akifumi Yoshimoto, Yuji Matsumoto
To assess performance, we construct a model based on an attention-based encoder-decoder architecture, in which the source language is fed to the encoder as a sequence and the decoder generates the target language as a linearized dependency tree.
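The decoder's output format can be illustrated with a small sketch of tree linearization — turning a dependency tree into a flat bracketed token sequence that a sequence decoder can emit. The tuple representation and bracketing scheme below are illustrative assumptions, not the paper's exact scheme.

```python
def linearize(node):
    """Flatten a dependency tree into a bracketed token sequence.

    node: (word, [children]) — a hypothetical nested-tuple encoding.
    A head with dependents is wrapped as "( head ...deps... )";
    a leaf contributes just its word.
    """
    word, children = node
    if not children:
        return [word]
    tokens = ["(", word]
    for child in children:
        tokens.extend(linearize(child))
    tokens.append(")")
    return tokens

# "I saw the dog": "saw" heads "I" and "dog"; "dog" heads "the".
tree = ("saw", [("I", []), ("dog", [("the", [])])])
print(" ".join(linearize(tree)))  # → ( saw I ( dog the ) )
```

Because the bracketed sequence is just a string of tokens, a standard sequence decoder can generate it token by token, and the tree is recovered by matching brackets.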
no code implementations • WS 2016 • Ayaka Morimoto, Akifumi Yoshimoto, Akihiko Kato, Hiroyuki Shindo, Yuji Matsumoto
This paper presents our ongoing work on the compilation of an English multi-word expression (MWE) lexicon.