DialBERT: A Hierarchical Pre-Trained Model for Conversation Disentanglement

Disentanglement is the problem that arises when multiple conversations occur simultaneously in the same channel, and a listener must decide which conversation each utterance belongs to before responding. We propose a new model, named Dialogue BERT (DialBERT), which integrates local and global semantics in a single stream of messages to disentangle conversations that are mixed together...
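The core decision the abstract describes — linking an incoming message to the context utterance it most likely replies to — can be illustrated with a toy pairwise-scoring sketch. This is purely a hypothetical illustration of the task setup, not DialBERT itself: the `embed` function below is a stand-in bag-of-characters encoder, where the real model would use a BERT-style encoder with local and global context.

```python
import math

def embed(text):
    # Stand-in "local" encoder: bag-of-characters vector.
    # (A real disentanglement model would use a learned encoder here.)
    vec = [0.0] * 26
    for ch in text.lower():
        if 'a' <= ch <= 'z':
            vec[ord(ch) - ord('a')] += 1.0
    return vec

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def link_message(context, message):
    """Return the index of the context utterance the new message
    most plausibly replies to (argmax over pairwise scores)."""
    scores = [cosine(embed(c), embed(message)) for c in context]
    return max(range(len(scores)), key=scores.__getitem__)

# Two interleaved conversations in one channel:
context = ["any tips for installing ubuntu?",
           "who won the match last night?"]
print(link_message(context, "try the live usb installer"))  # → 0
```

Here the reply about an installer is linked to the Ubuntu question (index 0) rather than the sports question, which is the disentanglement decision the paper's model learns to make with far richer semantics.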
