no code implementations • EMNLP 2020 • Baiyun Cui, Yingming Li, Zhongfei Zhang
In this paper, we introduce a novel BERT-enhanced Relational Sentence Ordering Network (referred to as BRSON), which leverages BERT to better capture the dependency relationships among sentences and thereby enhance coherence modeling for the entire paragraph.
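The abstract describes ordering sentences from pairwise dependency relationships. As a minimal sketch only (not the paper's actual decoder), the snippet below greedily orders sentences from a hypothetical pairwise "i precedes j" score matrix; in BRSON such scores would come from a BERT-based pairwise encoder, but here the matrix is simply an input.

```python
import numpy as np

def greedy_order(pair_scores):
    """Greedily order sentences from a pairwise precedence score matrix.

    pair_scores[i, j] is a score for sentence i appearing before
    sentence j (hypothetical stand-in for BERT-derived relation scores).
    """
    n = pair_scores.shape[0]
    remaining = set(range(n))
    order = []
    while remaining:
        # pick the sentence that most strongly precedes all others left
        best = max(remaining,
                   key=lambda i: sum(pair_scores[i, j]
                                     for j in remaining if j != i))
        order.append(best)
        remaining.remove(best)
    return order
```

A beam search over partial orders would typically replace this greedy step in practice; the greedy variant just makes the role of the pairwise scores concrete.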
no code implementations • IJCNLP 2019 • Baiyun Cui, Yingming Li, Ming Chen, Zhongfei Zhang
In this paper, we develop a novel Sparse Self-Attention Fine-tuning model (referred to as SSAF), which integrates sparsity into the self-attention mechanism to enhance the fine-tuning performance of BERT.
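The exact sparsity mechanism is not given in this snippet; as a hedged illustration of one way to make self-attention sparse, the sketch below masks all but the top-k scores per query before the softmax, so each token attends to exactly k others. The top-k choice is an assumption for illustration, not necessarily SSAF's formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sparse_self_attention(X, k=2):
    """Self-attention where each query attends only to its top-k keys.

    X: (seq_len, d) token representations. Scores outside the top-k
    are set to -inf before the softmax, making the attention
    distribution exactly sparse.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                    # (seq_len, seq_len)
    kth = np.sort(scores, axis=-1)[:, -k][:, None]   # k-th largest per row
    masked = np.where(scores >= kth, scores, -np.inf)
    weights = softmax(masked, axis=-1)
    return weights @ X, weights
```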
no code implementations • EMNLP 2018 • Baiyun Cui, Yingming Li, Ming Chen, Zhongfei Zhang
In this paper, we propose a novel deep attentive sentence ordering network (referred to as ATTOrderNet), which integrates a self-attention mechanism with LSTMs in the encoding of input sentences.
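A key property of such an encoder is that the paragraph representation should not depend on the (unknown) input order of the sentences. As a minimal sketch under that assumption, the snippet below applies position-free self-attention over sentence vectors and mean-pools the result; in ATTOrderNet the sentence vectors would come from an LSTM encoder, whereas here any fixed-size vectors stand in for them.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def paragraph_vector(sent_vecs):
    """Aggregate sentence vectors with position-free self-attention.

    sent_vecs: (n_sentences, d) sentence encodings (hypothetically
    produced by an LSTM sentence encoder). Because no positional
    information is used, mean-pooling the attended outputs yields
    the same paragraph vector for any input order.
    """
    d = sent_vecs.shape[-1]
    attn = softmax(sent_vecs @ sent_vecs.T / np.sqrt(d), axis=-1)
    return (attn @ sent_vecs).mean(axis=0)
```

Permuting the rows of `sent_vecs` permutes the attended outputs identically, so the pooled vector is order-invariant, which is what lets the model encode an unordered sentence set before predicting an order.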
1 code implementation • 21 Oct 2017 • Baiyun Cui, Yingming Li, Yaqing Zhang, Zhongfei Zhang
In this paper, we propose a novel deep coherence model (DCM) using a convolutional neural network architecture to capture the text coherence.
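As a simplified stand-in for such a convolutional coherence model (not the DCM's actual architecture), the sketch below slides a single filter over windows of consecutive sentence vectors, scores each window locally, and averages the local scores into a document-level coherence score. The filter `W` and the window size are hypothetical parameters for illustration.

```python
import numpy as np

def coherence_score(sent_vecs, W, window=3):
    """Score text coherence with a convolution over sentence windows.

    sent_vecs: (n_sentences, d) sentence vectors.
    W: (window * d,) filter weights (hypothetical, randomly
    initialized here; learned in a real model).
    Each window of consecutive sentences gets a local coherence
    score in (-1, 1); the document score is their mean.
    """
    n, d = sent_vecs.shape
    scores = []
    for i in range(n - window + 1):
        clique = sent_vecs[i:i + window].reshape(-1)   # concatenate window
        scores.append(np.tanh(clique @ W))             # local score
    return float(np.mean(scores))
```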