DialBERT: A Hierarchical Pre-Trained Model for Conversation Disentanglement

8 Apr 2020 · Tianda Li, Jia-Chen Gu, Xiaodan Zhu, Quan Liu, Zhen-Hua Ling, Zhiming Su, Si Wei

Disentanglement is the problem of separating multiple conversations that occur simultaneously in the same channel, where a listener must decide which utterances belong to the conversation they intend to respond to. We propose a new model, named Dialogue BERT (DialBERT), which integrates local and global semantics in a single stream of messages to disentangle conversations that are mixed together...
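The disentanglement task described above is commonly framed as reply linking: each incoming message is attached to the earlier message (and hence the thread) it most plausibly responds to. As a minimal illustration of that framing, the sketch below greedily links each message to the best-scoring existing thread using a toy lexical-overlap scorer; in the paper's setting, DialBERT's learned pairwise score would replace this stand-in `jaccard` function, and the threshold here is an arbitrary assumption.

```python
def jaccard(a: str, b: str) -> float:
    """Toy stand-in for a learned pairwise relevance score:
    word-level Jaccard overlap between two utterances."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def disentangle(messages, score=jaccard, threshold=0.1):
    """Greedy thread assignment: attach each message to the thread whose
    last utterance scores highest, or start a new thread if no score
    clears the (assumed) threshold. Returns lists of message indices."""
    threads = []
    for i, msg in enumerate(messages):
        best, best_t = 0.0, None
        for t, thread in enumerate(threads):
            s = score(messages[thread[-1]], msg)
            if s > best:
                best, best_t = s, t
        if best_t is None or best < threshold:
            threads.append([i])          # start a new conversation
        else:
            threads[best_t].append(i)    # continue an existing one
    return threads

stream = [
    "does anyone know how to install nvidia drivers",
    "what time is the meetup tonight",
    "try the nvidia drivers from the ppa",
    "the meetup tonight starts at seven",
]
print(disentangle(stream))  # two interleaved conversations are separated
```

A real system would score a message against more context than just the last utterance of each thread, which is precisely the local-plus-global modeling gap DialBERT targets.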
