Search Results for author: Toshihiro Takeda

Found 1 paper, 1 paper with code

Pre-training technique to localize medical BERT and enhance biomedical BERT

1 code implementation · 14 May 2020 · Shoya Wada, Toshihiro Takeda, Shiro Manabe, Shozo Konishi, Jun Kamohara, Yasushi Matsumura

We confirmed that our Japanese medical BERT outperformed conventional baselines and the other BERT models on a medical document classification task. We also found that our English BERT, pre-trained on both general and medical-domain corpora, performed sufficiently well for practical use on the Biomedical Language Understanding Evaluation (BLUE) benchmark.

Document Classification · Transfer Learning
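
As a loose illustration of the transfer-learning setup the abstract describes (not the authors' code), the sketch below shows how a pre-trained BERT checkpoint could be loaded for a document classification task with the Hugging Face Transformers library. The checkpoint name, label count, and example texts are placeholders; the paper's domain-adapted Japanese or English medical BERT weights would be substituted if available.

```python
# Minimal sketch: fine-tuning-style inference with a pre-trained BERT
# for document classification (placeholder checkpoint, not the paper's model).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "bert-base-uncased"  # stand-in for a domain-adapted medical BERT
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# Hypothetical documents to classify
texts = ["Patient presents with chest pain.", "Routine follow-up visit."]
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits
predictions = logits.argmax(dim=-1)  # predicted class index per document
print(predictions)
```

In practice the classification head would be trained on labeled medical documents before inference; the snippet only shows how the pre-trained encoder plugs into a classification pipeline.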
