no code implementations • Findings (EMNLP) 2021 • Satoshi Akasaki, Naoki Yoshinaga, Masashi Toyoda
Experiments on the Twitter datasets confirm the effectiveness of our typing model and the context selector.
1 code implementation • Findings (EMNLP) 2021 • Shoetsu Sato, Naoki Yoshinaga, Masashi Toyoda, Masaru Kitsuregawa
Our method chooses the most probable one from redundantly sampled latent variables, tying the chosen variable to a given response.
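The sampling-and-selection idea can be sketched as follows, assuming a hypothetical scorer that rates how well a latent sample explains a given response (all names here are illustrative, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)

def score(z, response_vec):
    """Hypothetical scorer: cosine similarity between a latent sample
    and a vector representation of the target response."""
    return float(z @ response_vec /
                 (np.linalg.norm(z) * np.linalg.norm(response_vec)))

def select_latent(response_vec, n_samples=10, dim=4):
    """Redundantly sample latent variables and keep the one
    most compatible with the given response."""
    samples = rng.standard_normal((n_samples, dim))
    scores = [score(z, response_vec) for z in samples]
    return samples[int(np.argmax(scores))]

response_vec = np.ones(4)
z_best = select_latent(response_vec)
```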
no code implementations • EMNLP (BlackboxNLP) 2021 • Daisuke Oba, Naoki Yoshinaga, Masashi Toyoda
Probing classifiers have been extensively used to inspect whether a model component captures specific linguistic phenomena.
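Probing in this sense amounts to fitting a simple classifier on frozen representations; the toy features and labels below are illustrative, not the authors' setup:

```python
import numpy as np

rng = np.random.default_rng(42)

# Frozen "model representations": 200 vectors of dimension 8.
# The (synthetic) linguistic label is linearly encoded in dimension 0,
# so a linear probe should be able to recover it.
X = rng.standard_normal((200, 8))
y = (X[:, 0] > 0).astype(float)

# A minimal logistic-regression probe trained by gradient descent.
w = np.zeros(8)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid
    w -= 1.0 * (X.T @ (p - y) / len(y))
    b -= 1.0 * float(np.mean(p - y))

acc = float(np.mean(((X @ w + b) > 0) == (y > 0.5)))
# High probe accuracy suggests the representation encodes the label.
```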
no code implementations • 4 Jan 2024 • Yuma Tsuta, Naoki Yoshinaga, Shoetsu Sato, Masashi Toyoda
Open-domain dialogue systems have started to engage in continuous conversations with humans.
no code implementations • 14 Sep 2023 • Daisuke Oba, Naoki Yoshinaga, Masashi Toyoda
The meanings of words and phrases depend not only on where they are used (contexts) but also on who uses them (writers).
no code implementations • 13 Oct 2022 • Satoshi Akasaki, Naoki Yoshinaga, Masashi Toyoda
The major challenge is detecting uncertain contexts of disappearing entities from noisy microblog posts.
1 code implementation • Findings of the Association for Computational Linguistics 2020 • Shoetsu Sato, Jin Sakuma, Naoki Yoshinaga, Masashi Toyoda, Masaru Kitsuregawa
Prior to fine-tuning, our method replaces the embedding layers of the NMT model by projecting general word embeddings induced from monolingual data in a target domain onto a source-domain embedding space.
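The projection step can be sketched as learning a linear map from target-domain embeddings into the source-domain space by least squares over shared anchor words (a toy illustration with random vectors, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 6

# Source-domain embedding space (e.g., the NMT model's embeddings) and
# target-domain embeddings induced from monolingual data, related here
# by a known linear map so the recovery can be checked.
true_map = rng.standard_normal((dim, dim))
tgt_anchor = rng.standard_normal((50, dim))   # anchor words, target space
src_anchor = tgt_anchor @ true_map            # same words, source space

# Learn W minimizing ||tgt_anchor @ W - src_anchor||^2.
W, *_ = np.linalg.lstsq(tgt_anchor, src_anchor, rcond=None)

# Project a new target-domain word vector into the source-domain space;
# such projected vectors would replace rows of the NMT embedding layer.
new_word = rng.standard_normal(dim)
projected = new_word @ W
```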
no code implementations • EMNLP (NLP-COVID19) 2020 • Akiko Aizawa, Frederic Bergeron, Junjie Chen, Fei Cheng, Katsuhiko Hayashi, Kentaro Inui, Hiroyoshi Ito, Daisuke Kawahara, Masaru Kitsuregawa, Hirokazu Kiyomaru, Masaki Kobayashi, Takashi Kodama, Sadao Kurohashi, Qianying Liu, Masaki Matsubara, Yusuke Miyao, Atsuyuki Morishima, Yugo Murawaki, Kazumasa Omura, Haiyue Song, Eiichiro Sumita, Shinji Suzuki, Ribeka Tanaka, Yu Tanaka, Masashi Toyoda, Nobuhiro Ueda, Honai Ueoka, Masao Utiyama, Ying Zhong
The global pandemic of COVID-19 has made the public pay close attention to related news, covering various domains, such as sanitation, treatment, and effects on education.
no code implementations • ACL 2020 • Yuma Tsuta, Naoki Yoshinaga, Masashi Toyoda
Experimental results on massive Twitter data confirmed that υBLEU is comparable to ΔBLEU in terms of its correlation with human judgment, and that the state-of-the-art automatic evaluation method, RUBER, is improved by integrating υBLEU.
no code implementations • 30 Apr 2020 • Shoetsu Sato, Jin Sakuma, Naoki Yoshinaga, Masashi Toyoda, Masaru Kitsuregawa
Prior to fine-tuning, our method replaces the embedding layers of the NMT model by projecting general word embeddings induced from monolingual data in a target domain onto a source-domain embedding space.
no code implementations • 8 Jul 2019 • Satoshi Akasaki, Naoki Yoshinaga, Masashi Toyoda
Keeping up to date on emerging entities that appear every day is indispensable for various applications, such as social-trend analysis and marketing research.
no code implementations • NAACL 2019 • Shonosuke Ishiwatari, Hiroaki Hayashi, Naoki Yoshinaga, Graham Neubig, Shoetsu Sato, Masashi Toyoda, Masaru Kitsuregawa
When reading a text, it is common to become stuck on unfamiliar words and phrases, such as polysemous words with novel senses, rarely used idioms, internet slang, or emerging entities.
no code implementations • NAACL 2019 • Daisuke Oba, Naoki Yoshinaga, Shoetsu Sato, Satoshi Akasaki, Masashi Toyoda
In this study, we propose a method of modeling such personal biases in word meanings (hereafter, semantic variations) with personalized word embeddings obtained by solving a task on subjective text while regarding words used by different individuals as different words.
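The idea of "regarding words used by different individuals as different words" can be sketched as rewriting each token with a writer-specific suffix before embedding training; the delimiter and function names are illustrative:

```python
def personalize(tokens, user_id, sep="@"):
    """Turn shared vocabulary items into user-specific ones so that
    an embedding model learns a separate vector per (word, writer)."""
    return [f"{t}{sep}{user_id}" for t in tokens]

# The same surface word becomes two distinct vocabulary entries:
a = personalize(["this", "movie", "is", "sick"], "user1")
b = personalize(["this", "movie", "is", "sick"], "user2")
# "sick@user1" and "sick@user2" receive independent embeddings,
# letting the model capture personal biases in word meanings.
```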
1 code implementation • 1 Nov 2018 • Shonosuke Ishiwatari, Hiroaki Hayashi, Naoki Yoshinaga, Graham Neubig, Shoetsu Sato, Masashi Toyoda, Masaru Kitsuregawa
When reading a text, it is common to become stuck on unfamiliar words and phrases, such as polysemous words with novel senses, rarely used idioms, internet slang, or emerging entities.
1 code implementation • WS 2017 • Masato Neishi, Jin Sakuma, Satoshi Tohda, Shonosuke Ishiwatari, Naoki Yoshinaga, Masashi Toyoda
In this paper, we describe the team UT-IIS's system and results for the WAT 2017 translation tasks.
no code implementations • COLING 2016 • Tatsuya Iwanari, Kohei Ohara, Naoki Yoshinaga, Nobuhiro Kaji, Masashi Toyoda, Masaru Kitsuregawa
Kotonush, a system that clarifies people's values on various concepts on the basis of what they write about on social media, is presented.