Capturing Long-range Contextual Dependencies with Memory-enhanced Conditional Random Fields

IJCNLP 2017  ·  Fei Liu, Timothy Baldwin, Trevor Cohn

Despite successful applications across a broad range of NLP tasks, conditional random fields (CRFs), in particular the linear-chain variant, are only able to model local features. While this has important benefits in terms of inference tractability, it limits the ability of the model to capture long-range dependencies between items. Attempts to extend CRFs to capture long-range dependencies have largely come at the cost of computational complexity and approximate inference. In this work, we propose an extension to CRFs by integrating external memory, taking inspiration from memory networks, thereby allowing CRFs to incorporate information far beyond neighbouring steps. Experiments across two tasks show substantial improvements over strong CRF and LSTM baselines.
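To make the idea concrete, below is a minimal sketch of how attention over an external memory of token representations might augment the emission scores of a standard linear-chain CRF. This is not the authors' implementation: the class and function names, the dimensions, the self-attention form of the memory read, and the PyTorch framing are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MemoryEnhancedEmissions(nn.Module):
    """Hypothetical sketch: each token attends over all other token
    representations (the "external memory") and the attention-weighted
    summary is concatenated to the local representation before the
    emission projection, giving the CRF access to long-range context."""

    def __init__(self, hidden_dim, num_tags):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.emit = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, h):
        # h: (seq_len, hidden_dim) token representations, e.g. BiLSTM outputs
        q = self.query(h)                                  # one query per token
        scores = q @ h.t()                                 # (seq_len, seq_len) match scores
        mask = torch.eye(h.size(0), dtype=torch.bool)
        scores = scores.masked_fill(mask, float("-inf"))   # no self-attention
        attn = F.softmax(scores, dim=-1)
        memory = attn @ h                                  # long-range context summary
        return self.emit(torch.cat([h, memory], dim=-1))   # (seq_len, num_tags)


def crf_log_partition(emissions, transitions):
    """Standard forward algorithm for a linear-chain CRF in log space.
    emissions: (seq_len, num_tags); transitions: (num_tags, num_tags)."""
    alpha = emissions[0]
    for t in range(1, emissions.size(0)):
        # alpha[j] = logsumexp_i(alpha[i] + transitions[i, j]) + emissions[t, j]
        alpha = torch.logsumexp(alpha.unsqueeze(1) + transitions, dim=0) + emissions[t]
    return torch.logsumexp(alpha, dim=0)


if __name__ == "__main__":
    h = torch.randn(12, 64)                  # 12 tokens, 64-dim representations
    model = MemoryEnhancedEmissions(64, 5)   # 5 output tags
    emissions = model(h)                     # memory-aware emission scores
    transitions = torch.randn(5, 5)
    log_z = crf_log_partition(emissions, transitions)
```

Note that the memory read only modifies the emission potentials; the transition structure and exact linear-chain inference (forward algorithm, Viterbi) are untouched, which is how this style of extension avoids the approximate inference that earlier long-range CRF variants required.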

