no code implementations • INLG (ACL) 2020 • Yumi Hamazono, Yui Uehara, Hiroshi Noji, Yusuke Miyao, Hiroya Takamura, Ichiro Kobayashi
On top of this, we employ a copy mechanism that is well suited to referring to the contents of data records in the market price data.
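A copy mechanism of this kind can be sketched as a pointer-generator-style mixture: the model either generates a word from its vocabulary or copies a value directly from a source data record. This is a generic illustration, not the paper's exact architecture; the function names and the `p_gen` gate are hypothetical.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of scores."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def copy_generate(vocab, vocab_logits, record_values, attn_logits, p_gen):
    """Mix a generation distribution over the vocabulary with a copy
    distribution over source data records (pointer-generator style):

        p(w) = p_gen * P_vocab(w)
             + (1 - p_gen) * (attention mass on records whose value is w)
    """
    p_vocab = softmax(vocab_logits)
    p_attn = softmax(attn_logits)
    probs = {w: p_gen * p for w, p in zip(vocab, p_vocab)}
    for value, a in zip(record_values, p_attn):
        probs[value] = probs.get(value, 0.0) + (1.0 - p_gen) * a
    return probs

# Copying lets the model emit record values (e.g., exact prices) that are
# rare or absent in the generation vocabulary.
dist = copy_generate(
    vocab=["rises", "falls", "to"],
    vocab_logits=[2.0, 0.5, 0.1],
    record_values=["23,433", "Nikkei"],
    attn_logits=[1.5, 0.2],
    p_gen=0.7,
)
```

Because the copy distribution places mass directly on record values, numbers never seen at training time remain reachable at decoding time.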
no code implementations • INLG (ACL) 2021 • Tatsuya Ishigaki, Goran Topic, Yumi Hamazono, Hiroshi Noji, Ichiro Kobayashi, Yusuke Miyao, Hiroya Takamura
In this study, we introduce a new large-scale dataset that contains aligned video data, structured numerical data, and transcribed commentaries that consist of 129,226 utterances in 1,389 races in a game.
no code implementations • spnlp (ACL) 2022 • Shunsuke Kando, Hiroshi Noji, Yusuke Miyao
On average, the performance of our best model represents a 19% increase in accuracy over the worst choice across all languages.
2 code implementations • EMNLP 2021 • Ryo Yoshida, Hiroshi Noji, Yohei Oseki
In computational linguistics, it has been shown that hierarchical structures make language models (LMs) more human-like.
1 code implementation • Findings (ACL) 2021 • Hiroshi Noji, Yohei Oseki
However, RNNGs are known to be harder to scale due to the difficulty of batched training.
1 code implementation • COLING 2020 • Namgi Han, Goran Topic, Hiroshi Noji, Hiroya Takamura, Yusuke Miyao
Our analysis, including shifting of training and test datasets and training on a union of the datasets, suggests that our progress in solving SimpleQuestions dataset does not indicate the success of more general simple question answering.
1 code implementation • COLING 2020 • Yui Uehara, Tatsuya Ishigaki, Kasumi Aoki, Hiroshi Noji, Keiichi Goshima, Ichiro Kobayashi, Hiroya Takamura, Yusuke Miyao
Existing models for data-to-text tasks generate fluent but sometimes incorrect sentences, e.g., "Nikkei gains" is generated when "Nikkei drops" is expected.
2 code implementations • COLING 2020 • Hicham El Boukkouri, Olivier Ferret, Thomas Lavergne, Hiroshi Noji, Pierre Zweigenbaum, Junichi Tsujii
Due to the compelling improvements brought by BERT, many recent representation models have adopted the Transformer architecture as their main building block, and have consequently inherited the WordPiece tokenization system, even though this tokenizer is not intrinsically linked to the Transformer architecture.
Ranked #1 on Semantic Similarity on ClinicalSTS
1 code implementation • ACL 2020 • Hiroshi Noji, Hiroya Takamura
Neural language models are commonly trained only on positive examples, a set of sentences in the training data, but recent studies suggest that the models trained in this way are not capable of robustly handling complex syntactic constructions, such as long-distance agreement.
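One way to exploit explicit negative examples is a token-level margin objective added on top of the usual language-modeling loss. The specific form below is an illustration under that assumption, not necessarily the paper's exact formulation.

```python
def token_margin_loss(logp_correct, logp_wrong, margin=1.0):
    """Hinge-style penalty, incurred only when the log-probability of an
    ungrammatical token (e.g., a verb with the wrong agreement, as in
    'The keys to the cabinet *is* ...') comes within `margin` of the
    grammatical token's log-probability."""
    return max(0.0, margin - (logp_correct - logp_wrong))

# Well-separated pair: no extra penalty on top of the usual LM loss.
assert token_margin_loss(-1.0, -5.0) == 0.0

# The wrong form scores too close to the correct one: positive loss
# pushes the model to separate them by at least the margin.
assert token_margin_loss(-2.0, -2.5, margin=1.0) == 0.5
```

The appeal of such an objective is that it targets exactly the constructions (e.g., long-distance agreement) on which purely positive training is known to be fragile.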
no code implementations • WS 2019 • Kasumi Aoki, Akira Miyazawa, Tatsuya Ishigaki, Tatsuya Aoki, Hiroshi Noji, Keiichi Goshima, Ichiro Kobayashi, Hiroya Takamura, Yusuke Miyao
We propose a data-to-document generator that can easily control the contents of output texts based on a neural language model.
2 code implementations • ACL 2019 • Hayate Iso, Yui Uehara, Tatsuya Ishigaki, Hiroshi Noji, Eiji Aramaki, Ichiro Kobayashi, Yusuke Miyao, Naoaki Okazaki, Hiroya Takamura
We propose a data-to-text generation model with two modules, one for tracking and the other for text generation.
no code implementations • ACL 2019 • Masashi Yoshikawa, Hiroshi Noji, Koji Mineshima, Daisuke Bekki
We propose a new domain adaptation method for Combinatory Categorial Grammar (CCG) parsing, based on the idea of automatic generation of CCG corpora exploiting cheaper resources of dependency trees.
1 code implementation • 15 Nov 2018 • Masashi Yoshikawa, Koji Mineshima, Hiroshi Noji, Daisuke Bekki
In logic-based approaches to reasoning tasks such as Recognizing Textual Entailment (RTE), it is important for a system to have a large amount of knowledge data.
no code implementations • COLING 2018 • Ryosuke Kohita, Hiroshi Noji, Yuji Matsumoto
One main challenge for incremental transition-based parsers, when future inputs are invisible, is to extract good features from a limited local context.
no code implementations • COLING 2018 • Quy Nguyen, Yusuke Miyao, Hiroshi Noji, Nhung Nguyen
Syntactic parsing plays a crucial role in improving the quality of natural language processing tasks.
no code implementations • NAACL 2018 • Masashi Yoshikawa, Koji Mineshima, Hiroshi Noji, Daisuke Bekki
In formal logic-based approaches to Recognizing Textual Entailment (RTE), a Combinatory Categorial Grammar (CCG) parser is used to parse input premises and hypotheses to obtain their logical formulas.
no code implementations • IJCNLP 2017 • Frances Yung, Hiroshi Noji, Yuji Matsumoto
Humans process language word by word and construct partial linguistic structures on the fly before the end of the sentence is perceived.
no code implementations • WS 2017 • Ryosuke Kohita, Hiroshi Noji, Yuji Matsumoto
We present a new transition system with word reordering for unrestricted non-projective dependency parsing.
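To make "word reordering" concrete, the sketch below runs a gold action sequence through an arc-standard system extended with a SWAP transition in the spirit of Nivre's online-reordering parser; the paper's own transition system differs, and the example tree is invented. SWAP moves a word back to the buffer, reordering the input so that a crossing (non-projective) arc can be built with purely projective operations.

```python
def run(actions, n):
    """Execute arc-standard transitions plus SWAP on words 1..n (0 = root).

    SH   shift the next buffer word onto the stack
    SWAP move the second-topmost stack word back to the buffer front
    LA   stack top heads the word below it (left-arc); remove the dependent
    RA   word below heads the stack top (right-arc); remove the dependent
    """
    stack, buffer, arcs = [0], list(range(1, n + 1)), set()
    for act in actions:
        if act == "SH":
            stack.append(buffer.pop(0))
        elif act == "SWAP":
            buffer.insert(0, stack.pop(-2))  # reorder: re-process this word later
        elif act == "LA":
            dep = stack.pop(-2)
            arcs.add((stack[-1], dep))
        elif act == "RA":
            dep = stack.pop()
            arcs.add((stack[-1], dep))
    return arcs

# Non-projective toy tree over words 1..4: 0->3, 3->1, 3->4, 4->2.
# The arc 4->2 crosses over word 3, so a SWAP of words 2 and 3 is needed.
gold = run(["SH", "SH", "SH", "SWAP", "LA", "SH", "SH", "LA", "RA", "RA"], 4)
print(sorted(gold))  # [(0, 3), (3, 1), (3, 4), (4, 2)]
```

After the SWAP, the words are processed in the order 1, 3, 2, 4, under which the tree is projective, which is exactly the trick that lets a bottom-up system recover unrestricted non-projective structures.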
no code implementations • CoNLL 2017 • Motoki Sato, Hitoshi Manabe, Hiroshi Noji, Yuji Matsumoto
We describe our submission to the CoNLL 2017 shared task, which exploits the shared common knowledge of a language across different domains via a domain adaptation technique.
1 code implementation • ACL 2017 • Masashi Yoshikawa, Hiroshi Noji, Yuji Matsumoto
Our model achieves the state-of-the-art results on English and Japanese CCG parsing.
1 code implementation • EACL 2017 • Ryosuke Kohita, Hiroshi Noji, Yuji Matsumoto
Universal Dependencies (UD) is becoming a standard annotation scheme cross-linguistically, but it has been argued that this content-word-centered scheme is harder to parse than conventional schemes centered on function words.
no code implementations • 1 Aug 2016 • Hiroshi Noji
This connection suggests left-corner methods can be a tool to exploit the universal syntactic constraint that people avoid generating center-embedded structures.