no code implementations • ICML 2020 • Feng Wei
Our proposed unsupervised language model is trained on an unlabelled corpus; the pre-trained language model can then abstract the surrounding context of polyseme instances in a labelled corpus into context embeddings.
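The paper's pre-trained model is not reproduced here, but the idea of abstracting a polyseme's surrounding context into a context embedding can be sketched minimally. The windowed averaging and the random toy embedding table below are illustrative assumptions, not the paper's method:

```python
import numpy as np

def context_embedding(tokens, target_idx, embed, window=2):
    """Toy sketch: average the embeddings of the words surrounding the target,
    producing a context vector for that polyseme instance."""
    left = max(0, target_idx - window)
    ctx = tokens[left:target_idx] + tokens[target_idx + 1:target_idx + 1 + window]
    vecs = [embed[w] for w in ctx if w in embed]
    return np.mean(vecs, axis=0)

# Hypothetical embedding table (random, fixed seed, purely illustrative).
rng = np.random.default_rng(0)
vocab = ["the", "bank", "of", "river", "money", "flows"]
embed = {w: rng.standard_normal(4) for w in vocab}

sent = ["the", "bank", "of", "the", "river"]
vec = context_embedding(sent, 1, embed)  # context vector for "bank" (river sense)
```

A real implementation would replace the averaging with a trained language model, but the input/output shape of the operation is the same: one instance in context maps to one fixed-size vector.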
no code implementations • 7 Jan 2022 • Feng Wei, Zhenbo Chen, Zhenghong Hao, Fengxin Yang, Hua Wei, Bing Han, Sheng Guo
To make DCSC fully utilize the limited known intents, we propose a two-stage training procedure in which DCSC is trained on both labeled and unlabeled samples, achieving better text representation and clustering performance.
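The abstract does not give DCSC's training details, but a two-stage procedure over a few labeled and many unlabeled samples can be sketched with a simple centroid model. The toy data, the nearest-centroid classifier, and the pseudo-labeling step are all assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: two intents, a handful of labeled samples, many unlabeled ones.
X_lab = np.vstack([rng.normal(0, 1, (5, 3)), rng.normal(4, 1, (5, 3))])
y_lab = np.array([0] * 5 + [1] * 5)
X_unlab = np.vstack([rng.normal(0, 1, (40, 3)), rng.normal(4, 1, (40, 3))])

# Stage 1: estimate one centroid per known intent from the limited labeled data.
centroids = np.stack([X_lab[y_lab == k].mean(axis=0) for k in (0, 1)])

# Stage 2: pseudo-label the unlabeled samples by nearest centroid,
# then refit the centroids on labeled + pseudo-labeled data together.
dists = np.linalg.norm(X_unlab[:, None] - centroids, axis=2)
pseudo = np.argmin(dists, axis=1)
X_all = np.vstack([X_lab, X_unlab])
y_all = np.concatenate([y_lab, pseudo])
centroids = np.stack([X_all[y_all == k].mean(axis=0) for k in (0, 1)])
```

The design point this illustrates is that the second stage lets the unlabeled pool refine representations that the scarce labels alone could only roughly place.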
no code implementations • 3 Feb 2020 • Feng Wei, Uyen Trang Nguyen
Twitter is a web application that plays the dual roles of an online social network and a micro-blogging service.
no code implementations • 30 Jul 2019 • Feng Wei, Uyen Trang Nguyen, Hui Jiang
Our neural linking models consist of three parts: a PageRank-based candidate generation module, a dual-FOFE-net neural ranking model, and a simple NIL entity clustering system.
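The candidate generation module above is PageRank-based. As a minimal sketch of the underlying algorithm, standard power-iteration PageRank over a tiny toy graph is shown below; the graph and its interpretation as candidate entities are assumptions, not the paper's actual module:

```python
import numpy as np

def pagerank(adj, damping=0.85, iters=50):
    """Power-iteration PageRank. adj[i, j] = 1 means node j links to node i."""
    n = adj.shape[0]
    out = adj.sum(axis=0)
    # Column-normalize; dangling nodes (no out-links) spread mass uniformly.
    M = np.where(out > 0, adj / np.maximum(out, 1), 1.0 / n)
    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r = (1 - damping) / n + damping * M @ r
    return r

# Toy candidate graph: node 0 receives links from all other nodes,
# so it should come out with the highest score.
adj = np.array([[0, 1, 1, 1],
                [1, 0, 0, 0],
                [0, 0, 0, 0],
                [0, 0, 1, 0]], dtype=float)
scores = pagerank(adj)
```

In a candidate generation setting, the top-scoring nodes of such a graph would be kept as candidates for the downstream neural ranker.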