Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build.
Ranked #9 on Text Summarization on DUC 2004 Task 1
In this work, we model abstractive text summarization using Attentional Encoder-Decoder Recurrent Neural Networks, and show that they achieve state-of-the-art performance on two different corpora.
Ranked #8 on Text Summarization on DUC 2004 Task 1
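The core of such an attentional encoder-decoder is a per-step attention computation: the decoder state is scored against every encoder hidden state, the scores are normalized with a softmax, and the resulting weights form a context vector. A minimal pure-Python sketch of that step (dot-product scoring chosen for simplicity; the paper's learned alignment weights are omitted):

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of raw scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def attend(decoder_state, encoder_states):
    """One decoder step: return (context_vector, attention_weights)."""
    scores = [dot(decoder_state, h) for h in encoder_states]  # alignment scores
    weights = softmax(scores)                                 # attention distribution
    dim = len(encoder_states[0])
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(dim)]                           # weighted sum of encoder states
    return context, weights
```

At each output word the decoder would concatenate this context vector with its own state before predicting the next summary token.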
Recurrent neural network models with an attention mechanism have proven to be extremely effective on a wide variety of sequence-to-sequence problems.
Ranked #17 on Speech Recognition on TIMIT
We propose a selective encoding model to extend the sequence-to-sequence framework for abstractive sentence summarization.
Ranked #6 on Text Summarization on DUC 2004 Task 1
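The selective encoding idea is to filter each encoder hidden state through a per-dimension gate in (0, 1), computed from that state and a whole-sentence representation, before the decoder attends to it. A toy sketch, assuming identity projections in place of the model's learned gate parameters:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def selective_gate(encoder_states, sentence_vec):
    """Filter each encoder state with an elementwise gate in (0, 1).

    In the selective-encoding model the gate is sigmoid(W h_i + U s + b)
    with learned W, U, b; here we use identity weights (h_i + s_i) purely
    for illustration.
    """
    gated = []
    for h in encoder_states:
        gate = [sigmoid(hi + si) for hi, si in zip(h, sentence_vec)]
        gated.append([hi * g for hi, g in zip(h, gate)])
    return gated
```

Because every gate value lies strictly between 0 and 1, the gated states shrink toward zero in dimensions the gate deems irrelevant while keeping their sign.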
However, no cross-lingual parallel corpus exists in which the source-sentence language differs from the summary language, so a cross-lingual ASSUM (abstractive sentence summarization) system cannot be trained directly.
We propose a contrastive attention mechanism to extend the sequence-to-sequence framework for the abstractive sentence summarization task, which aims to generate a brief summary of a given source sentence.
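A contrastive attention mechanism pairs the conventional attention distribution with an "opponent" distribution over the least relevant source tokens, whose contribution is penalized during training. A simplified sketch, where the opponent head is modeled as a softmax over negated scores (a softmin); the paper's actual parameterization and training penalty are omitted:

```python
import math

def softmax(scores):
    # Numerically stable softmax.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def contrastive_attention(scores):
    """Return (attention, opponent_attention) weight distributions.

    The regular head attends to the most relevant tokens; the opponent
    head attends to the least relevant ones by negating the scores.
    Illustrative only; not the paper's exact formulation.
    """
    attn = softmax(scores)
    opponent = softmax([-s for s in scores])
    return attn, opponent
```

Discouraging the opponent head's contribution pushes the model to ground each summary word in the genuinely relevant parts of the source sentence.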