Search Results for author: Taiki Watanabe

Found 4 papers, 0 papers with code

Relation Extraction Using Multiple Pre-Training Models in Biomedical Domain

no code implementations RANLP 2021 Satoshi Hiai, Kazutaka Shimada, Taiki Watanabe, Akiva Miura, Tomoya Iwakura

In addition, our method extracts relations approximately three times faster than the BERT-based models on the ChemProt corpus and reduces memory usage to one sixth of that of the BERT-based models.

Relation Extraction

Multi-Task Learning for Chemical Named Entity Recognition with Chemical Compound Paraphrasing

no code implementations IJCNLP 2019 Taiki Watanabe, Akihiro Tamura, Takashi Ninomiya, Takuya Makino, Tomoya Iwakura

We propose a method to improve named entity recognition (NER) for chemical compounds using multi-task learning, jointly training a chemical NER model and a chemical compound paraphrase model.

Multi-Task Learning named-entity-recognition +2

CKY-based Convolutional Attention for Neural Machine Translation

no code implementations IJCNLP 2017 Taiki Watanabe, Akihiro Tamura, Takashi Ninomiya

This paper proposes a new attention mechanism for neural machine translation (NMT) based on convolutional neural networks (CNNs), inspired by the CKY algorithm.

Machine Translation NMT +2
