Story Ending Generation with Incremental Encoding and Commonsense Knowledge

30 Aug 2018  ·  Jian Guan, Yansen Wang, Minlie Huang

Generating a reasonable ending for a given story context, i.e., story ending generation, is a strong indication of story comprehension. This task requires not only understanding the context clues, which play an important role in planning the plot, but also handling implicit knowledge to produce a reasonable and coherent story. In this paper, we devise a novel model for story ending generation. The model adopts an incremental encoding scheme to represent the context clues that span the story context. In addition, commonsense knowledge is applied through multi-source attention to facilitate story comprehension and thus help generate coherent and reasonable endings. By building context clues and using implicit knowledge, the model is able to produce reasonable story endings. Automatic and manual evaluation shows that our model can generate more reasonable story endings than state-of-the-art baselines.
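
The abstract describes two components: an incremental encoding scheme that carries context clues from one sentence to the next, and multi-source attention that mixes in commonsense knowledge during encoding. The sketch below is a minimal, illustrative PyTorch rendering of that idea, not the authors' released implementation; the dimensions, the dot-product attention, the way the state-context and knowledge-context vectors are fused with the word embedding, and all names (`IncrementalEncoder`, `knowledge_proj`, etc.) are assumptions made for illustration.

```python
# Minimal sketch of incremental encoding with multi-source attention (illustrative only).
# Each sentence is encoded in turn; at every word step the encoder attends both to the
# hidden states of the previous sentence (context clues) and to embedded commonsense
# entities retrieved for the current word (knowledge). Details are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class IncrementalEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Input at each step: word embedding + previous-sentence context + knowledge context.
        self.cell = nn.LSTMCell(emb_dim + 2 * hid_dim, hid_dim)
        self.knowledge_proj = nn.Linear(emb_dim, hid_dim)

    def attend(self, query, memory):
        # Simple dot-product attention; returns a weighted sum of `memory` rows.
        scores = memory @ query             # (mem_len,)
        weights = F.softmax(scores, dim=0)
        return weights @ memory             # (hid_dim,)

    def forward(self, sentences, knowledge):
        """sentences: list of 1-D LongTensors of word ids (one per story sentence).
        knowledge: per sentence, per word, a LongTensor of related graph-entity ids
        (may be empty when no commonsense entry is retrieved)."""
        hid = self.cell.hidden_size
        h, c = torch.zeros(hid), torch.zeros(hid)
        prev_states = torch.zeros(1, hid)   # memory of the previous sentence's states
        for sent, sent_kg in zip(sentences, knowledge):
            states = []
            for word_id, kg_ids in zip(sent, sent_kg):
                # State context: attend over the hidden states of the previous sentence.
                ctx_state = self.attend(h, prev_states)
                # Knowledge context: attend over embedded commonsense entities for this word.
                if len(kg_ids) > 0:
                    kg_mem = self.knowledge_proj(self.embed(kg_ids))
                    ctx_know = self.attend(h, kg_mem)
                else:
                    ctx_know = torch.zeros_like(h)
                x = torch.cat([self.embed(word_id), ctx_state, ctx_know])
                h, c = self.cell(x.unsqueeze(0), (h.unsqueeze(0), c.unsqueeze(0)))
                h, c = h.squeeze(0), c.squeeze(0)
                states.append(h)
            prev_states = torch.stack(states)  # becomes the memory for the next sentence
        return h, prev_states  # final state would initialize the ending decoder


# Toy usage: a 100-word vocabulary and two 3-word "sentences" with hypothetical entity ids.
enc = IncrementalEncoder(vocab_size=100)
sents = [torch.tensor([1, 2, 3]), torch.tensor([4, 5, 6])]
kg = [[torch.tensor([7, 8]), torch.tensor([], dtype=torch.long), torch.tensor([9])],
      [torch.tensor([10]), torch.tensor([], dtype=torch.long), torch.tensor([11, 12])]]
final_h, memory = enc(sents, kg)
print(final_h.shape, memory.shape)
```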

Task: Image-guided Story Ending Generation
Dataset: VIST-E
Model: IE+MSA

Metric    Value   Global Rank
BLEU-1    19.15   #3
BLEU-2     5.74   #4
BLEU-3     2.73   #5
BLEU-4     1.63   #5
METEOR     6.59   #4
CIDEr     15.56   #3
ROUGE-L   20.62   #3

Methods

No methods listed for this paper.