Learning to Rank Question-Answer Pairs using Hierarchical Recurrent Encoder with Latent Topic Clustering

NAACL 2018 · Seunghyun Yoon, Joongbo Shin, Kyomin Jung

In this paper, we propose a novel end-to-end neural architecture for ranking candidate answers that adapts a hierarchical recurrent neural network and a latent topic clustering module. With our proposed model, a text is encoded into a vector representation from a word-level to a chunk-level to effectively capture the entire meaning. In particular, by adapting the hierarchical structure, our model shows very small performance degradation on longer text comprehension, while other state-of-the-art recurrent neural network models suffer from it. Additionally, the latent topic clustering module extracts semantic information from target samples. This clustering module is useful for any text-related task, as it allows each data sample to find its nearest topic cluster, thus helping the neural network model analyze the entire data. We evaluate our models on the Ubuntu Dialogue Corpus and a consumer electronics domain question-answering dataset related to Samsung products. The proposed model shows state-of-the-art results for ranking question-answer pairs.
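The latent topic clustering idea described above can be sketched roughly as follows: a text's encoded hidden vector is scored against a set of learned latent topic vectors, the scores are softmax-normalized, and the resulting weighted topic summary is concatenated to the original representation. This is a minimal illustrative sketch, not the paper's implementation; the function and variable names are hypothetical, and in the actual model the topic vectors are trainable parameters learned end-to-end.

```python
import math

def latent_topic_clustering(h, topics):
    """Illustrative sketch (hypothetical names): score a hidden vector h
    against K latent topic vectors by dot product, softmax the scores,
    and append the softmax-weighted topic summary to h."""
    # Dot-product similarity between h and each topic vector.
    scores = [sum(hi * ti for hi, ti in zip(h, t)) for t in topics]
    # Numerically stable softmax over the K topic scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    # Weighted sum of topic vectors, then concatenation with h.
    dim = len(topics[0])
    topic_summary = [sum(w * t[d] for w, t in zip(weights, topics))
                     for d in range(dim)]
    return h + topic_summary
```

In the full model this enriched vector would feed the downstream ranking layer; here it only shows how each sample is softly assigned to its nearest topic clusters.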


Datasets


Task              Dataset                         Model     Metric        Value   Global Rank
Answer Selection  Ubuntu Dialogue (v1, Ranking)   HRDE-LTC  1 in 10 R@1   0.684   #1
Answer Selection  Ubuntu Dialogue (v1, Ranking)   HRDE-LTC  1 in 2 R@1    0.916   #1
Answer Selection  Ubuntu Dialogue (v1, Ranking)   HRDE-LTC  1 in 10 R@2   0.822   #1
Answer Selection  Ubuntu Dialogue (v1, Ranking)   HRDE-LTC  1 in 10 R@5   0.960   #1
Answer Selection  Ubuntu Dialogue (v2, Ranking)   HRDE-LTC  1 in 10 R@1   0.652   #2
Answer Selection  Ubuntu Dialogue (v2, Ranking)   HRDE-LTC  1 in 10 R@2   0.815   #1
Answer Selection  Ubuntu Dialogue (v2, Ranking)   HRDE-LTC  1 in 10 R@5   0.966   #1
Answer Selection  Ubuntu Dialogue (v2, Ranking)   HRDE-LTC  1 in 2 R@1    0.915   #1
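The "1 in N R@k" metrics in the table mean: given one ground-truth answer among N candidates, count a hit when the model ranks the true answer within its top k. A minimal sketch of this evaluation (hypothetical helper names, not the paper's code):

```python
def recall_at_k(scores, correct_index, k):
    """1-in-N R@k for a single example: given model scores for N
    candidate answers, return 1.0 if the ground-truth answer ranks
    in the top k by score, else 0.0."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return 1.0 if correct_index in ranked[:k] else 0.0

def evaluate(examples, k):
    """Average R@k over evaluation examples of (scores, correct_index)."""
    hits = [recall_at_k(scores, idx, k) for scores, idx in examples]
    return sum(hits) / len(hits)
```

For example, "1 in 10 R@1" scores 10 candidates per question and counts a hit only when the true answer is ranked first, while "1 in 2 R@1" is the easier binary variant with a single distractor.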

Methods


No methods listed for this paper.