Search Results for author: Lisheng Gao

Found 2 papers, 0 papers with code

Application of Pre-training Models in Named Entity Recognition

no code implementations · 9 Feb 2020 · Yu Wang, Yining Sun, Zuchang Ma, Lisheng Gao, Yang Xu, Ting Sun

Then, we apply these pre-training models to an NER task by fine-tuning, and compare the effects of different model architectures and pre-training tasks on the NER task.
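The fine-tuning pattern the abstract describes — reusing a pretrained encoder's representations and training a small task-specific classifier on NER labels — can be sketched as follows. This is a minimal pure-Python illustration, not the paper's method: the "pretrained" encoder is stood in for by a frozen feature lookup table, and all tokens, labels, and hyperparameters are hypothetical.

```python
import math
import random

random.seed(0)

# Hypothetical frozen "pretrained" encoder: in practice this would be a
# model like BERT producing contextual vectors; here it is a fixed
# token-to-feature lookup standing in for the pretrained representations.
PRETRAINED = {
    "Paris":  [1.0, 0.1],
    "London": [0.9, 0.2],
    "eats":   [0.1, 1.0],
    "runs":   [0.2, 0.9],
}

# Toy per-token NER data: 1 = entity, 0 = not an entity (hypothetical).
DATA = [("Paris", 1), ("London", 1), ("eats", 0), ("runs", 0)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fine-tuning step: the encoder stays frozen; only the small
# classification head (w, b) on top of its features is trained.
w = [0.0, 0.0]
b = 0.0
lr = 1.0

for _ in range(200):
    for token, label in DATA:
        x = PRETRAINED[token]
        p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        g = p - label                      # gradient of the log loss
        w[0] -= lr * g * x[0]
        w[1] -= lr * g * x[1]
        b -= lr * g

def predict(token):
    x = PRETRAINED[token]
    return int(sigmoid(w[0] * x[0] + w[1] * x[1] + b) > 0.5)

print([predict(t) for t, _ in DATA])  # expected: [1, 1, 0, 0]
```

In real use the encoder's own weights are usually updated too (that is what "fine-tuning" means in the abstract); freezing them here just keeps the sketch short.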

Named Entity Recognition +1

Deep Restricted Boltzmann Networks

no code implementations · 15 Nov 2016 · Hengyuan Hu, Lisheng Gao, Quanbin Ma

The most famous among them are the deep belief network, which stacks multiple layer-wise pretrained RBMs to form a hybrid model, and the deep Boltzmann machine, which allows connections between hidden units to form a multi-layer structure.
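The greedy layer-wise stacking that the abstract attributes to deep belief networks can be sketched in pure Python: train an RBM on the data, feed its hidden activations to the next RBM, and repeat. This is a toy illustration under stated assumptions — the layer widths and data are hypothetical, bias updates and the usual supervised fine-tuning stage are omitted, and a single step of contrastive divergence stands in for full RBM training.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class RBM:
    """A tiny binary RBM trained with one-step contrastive divergence (CD-1)."""

    def __init__(self, n_vis, n_hid):
        self.w = [[random.gauss(0, 0.1) for _ in range(n_hid)]
                  for _ in range(n_vis)]
        self.b_hid = [0.0] * n_hid
        self.b_vis = [0.0] * n_vis

    def hid_probs(self, v):
        return [sigmoid(self.b_hid[j] +
                        sum(v[i] * self.w[i][j] for i in range(len(v))))
                for j in range(len(self.b_hid))]

    def vis_probs(self, h):
        return [sigmoid(self.b_vis[i] +
                        sum(h[j] * self.w[i][j] for j in range(len(h))))
                for i in range(len(self.b_vis))]

    def cd1(self, v0, lr=0.1):
        h0 = self.hid_probs(v0)                       # positive phase
        sample = [1.0 if random.random() < p else 0.0 for p in h0]
        v1 = self.vis_probs(sample)                   # one Gibbs step
        h1 = self.hid_probs(v1)                       # negative phase
        for i in range(len(v0)):                      # weight update only;
            for j in range(len(h0)):                  # bias updates omitted
                self.w[i][j] += lr * (v0[i] * h0[j] - v1[i] * h1[j])

# Greedy layer-wise pretraining: each RBM is trained on the hidden
# activations of the one below, then the RBMs are stacked into a DBN.
data = [[1, 1, 0, 0], [0, 0, 1, 1], [1, 1, 0, 0], [0, 0, 1, 1]]
sizes = [4, 3, 2]                      # hypothetical layer widths
stack = []
layer_input = data
for n_vis, n_hid in zip(sizes, sizes[1:]):
    rbm = RBM(n_vis, n_hid)
    for _ in range(50):
        for v in layer_input:
            rbm.cd1(v)
    stack.append(rbm)
    layer_input = [rbm.hid_probs(v) for v in layer_input]

def dbn_up(v):
    """Deterministic up-pass through the stacked RBMs."""
    for rbm in stack:
        v = rbm.hid_probs(v)
    return v

print(len(dbn_up(data[0])))  # top-layer representation has 2 units
```

The deep Boltzmann machine mentioned alongside it differs in that all layers are trained jointly as one undirected model rather than stacked greedily.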
