no code implementations • NAACL 2021 • Zhen Ke, Liang Shi, Songtao Sun, Erli Meng, Bin Wang, Xipeng Qiu
Recent research shows that pre-trained models (PTMs) are beneficial to Chinese Word Segmentation (CWS).
no code implementations • 13 Apr 2020 • Zhen Ke, Liang Shi, Erli Meng, Bin Wang, Xipeng Qiu, Xuanjing Huang
In addition, the pre-trained BERT language model has also been introduced into the MCCWS (multi-criteria Chinese word segmentation) task in a multi-task learning framework.