Applications of BERT Based Sequence Tagging Models on Chinese Medical Text Attributes Extraction

22 Aug 2020 · Gang Zhao, Teng Zhang, Chenxiao Wang, Ping Lv, Ji Wu

We convert the Chinese medical text attributes extraction task into a sequence tagging or machine reading comprehension task. Based on BERT pre-trained models, we have tried not only the widely used LSTM-CRF sequence tagging model but also other sequence models, such as CNN, UCNN, WaveNet, and SelfAttention, which reach performance similar to LSTM-CRF...
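The abstract only names the components of the tagging architecture, so the following is a minimal sketch rather than the authors' implementation. It assumes PyTorch, the HuggingFace `transformers` package with `bert-base-chinese`, the `pytorch-crf` package for the CRF layer, and a BIO-style attribute tag set.

```python
# Minimal sketch of a BERT + BiLSTM-CRF token tagger (not the authors' code).
import torch
import torch.nn as nn
from transformers import BertModel
from torchcrf import CRF


class BertBiLstmCrfTagger(nn.Module):
    """Tags each token of a Chinese medical text with a BIO-style attribute label."""

    def __init__(self, num_tags: int, bert_name: str = "bert-base-chinese",
                 lstm_hidden: int = 256):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        self.lstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                            batch_first=True, bidirectional=True)
        self.emissions = nn.Linear(2 * lstm_hidden, num_tags)
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, tags=None):
        # Contextual token representations from the pre-trained BERT encoder.
        hidden = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        hidden, _ = self.lstm(hidden)
        emissions = self.emissions(hidden)
        mask = attention_mask.bool()
        if tags is not None:
            # Training: negative log-likelihood of the gold tag sequence under the CRF.
            return -self.crf(emissions, tags, mask=mask, reduction="mean")
        # Inference: Viterbi decoding of the most likely tag sequence per example.
        return self.crf.decode(emissions, mask=mask)
```

The LSTM layer in this sketch could be swapped for the other heads the abstract mentions (CNN, WaveNet, self-attention) by replacing the sequence model between the BERT encoder and the emission projection; ensembling such variants is what the paper reports.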


