Positional Encoding to Control Output Sequence Length

NAACL 2019 · Sho Takase, Naoaki Okazaki

Neural encoder-decoder models have been successful in natural language generation tasks. However, real applications of abstractive summarization must satisfy an additional constraint: a generated summary should not exceed a desired length. In this paper, we propose a simple but effective extension of the sinusoidal positional encoding (Vaswani et al., 2017) that enables a neural encoder-decoder model to preserve the length constraint. Unlike previous studies that learn an embedding for each length, the proposed method can generate a text of any length, even if the target length did not appear in the training data. The experimental results show that the proposed method can not only control the generation length but also improve the ROUGE scores.
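
The abstract describes replacing the standard position index in the sinusoidal encoding with a length-aware quantity, so the decoder knows at every step how much of its length budget remains; the results table below also names an LRPE (length-ratio) variant. The sketch below is a minimal illustration, assuming the length-difference formulation (encode desired_len - pos instead of pos); function and variable names are illustrative, not taken from the authors' released code.

import numpy as np

def sinusoidal_encoding(positions, d_model):
    """Standard sinusoidal encoding (Vaswani et al., 2017) over the given indices."""
    pe = np.zeros((len(positions), d_model))
    div = np.exp(np.arange(0, d_model, 2) * (-np.log(10000.0) / d_model))
    pe[:, 0::2] = np.sin(positions[:, None] * div)
    pe[:, 1::2] = np.cos(positions[:, None] * div)
    return pe

def length_difference_encoding(desired_len, d_model):
    """Encode the remaining length (desired_len - pos) at each decoding step,
    rather than the absolute position pos."""
    pos = np.arange(desired_len)
    return sinusoidal_encoding(desired_len - pos, d_model)

# Usage: add to the decoder input embeddings, e.g. for a 30-token summary.
pe = length_difference_encoding(desired_len=30, d_model=512)
print(pe.shape)  # (30, 512)

Because each row depends only on the distance to the end of the sequence, every target length, seen or unseen during training, maps onto the same set of encoding rows, which is what allows the model to honor arbitrary length constraints.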


Results


Task: Text Summarization
Dataset: DUC 2004 Task 1
Model: Transformer+LRPE+PE+Re-ranking+Ensemble

Metric    Value   Global Rank
ROUGE-1   32.85   #2
ROUGE-2   11.78   #2
ROUGE-L   28.52   #1
