TransferTransfo: A Transfer Learning Approach for Neural Network Based Conversational Agents

23 Jan 2019  ·  Thomas Wolf, Victor Sanh, Julien Chaumond, Clement Delangue

We introduce a new approach to generative data-driven dialogue systems (e.g., chatbots) called TransferTransfo, which combines a transfer-learning-based training scheme with a high-capacity Transformer model. Fine-tuning is performed using a multi-task objective that combines several unsupervised prediction tasks. The resulting fine-tuned model shows strong improvements over current state-of-the-art end-to-end conversational models such as memory-augmented seq2seq and information-retrieval models. On the privately held PERSONA-CHAT dataset of the Conversational Intelligence Challenge 2, this approach obtains a new state of the art, with respective perplexity, Hits@1 and F1 metrics of 16.28 (45% absolute improvement), 80.7 (46% absolute improvement) and 19.5 (20% absolute improvement).
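The abstract does not spell out the individual prediction tasks, but the double-headed setup released alongside the paper pairs a language-modeling loss with a next-utterance classification loss over distractor candidates. Below is a minimal sketch of one such multi-task fine-tuning step using the Hugging Face transformers GPT2DoubleHeadsModel; the candidate texts, loss weights, and padding handling are illustrative assumptions, not the authors' exact training code.

```python
import torch
from transformers import GPT2Tokenizer, GPT2DoubleHeadsModel

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2DoubleHeadsModel.from_pretrained("gpt2")

# Two candidate replies to the same (toy) persona + dialogue context;
# index 0 is the gold next utterance, index 1 is a distractor.
candidates = [
    "i love gardening . <|endoftext|> nice , what do you grow ?",
    "i love gardening . <|endoftext|> my favorite team won last night .",
]
encoded = [tokenizer.encode(c) for c in candidates]
max_len = max(len(e) for e in encoded)

# Shape (batch=1, num_choices=2, seq_len); pad with the EOS token.
input_ids = torch.tensor(
    [[e + [tokenizer.eos_token_id] * (max_len - len(e)) for e in encoded]]
)
# Position of the last real token of each candidate, where the classification head reads.
mc_token_ids = torch.tensor([[len(e) - 1 for e in encoded]])

# Language-modeling labels: copy the inputs and mask padding with -100 so it is ignored
# (in the full setup, the context tokens would typically be masked as well).
lm_labels = torch.full_like(input_ids, -100)
for i, e in enumerate(encoded):
    lm_labels[0, i, : len(e)] = input_ids[0, i, : len(e)]

# Index of the gold candidate for the next-utterance classification head.
mc_labels = torch.tensor([0])

out = model(
    input_ids=input_ids,
    mc_token_ids=mc_token_ids,
    labels=lm_labels,
    mc_labels=mc_labels,
)

# Multi-task objective: weighted sum of the two losses (the weights are illustrative).
lm_coef, mc_coef = 2.0, 1.0
loss = lm_coef * out.loss + mc_coef * out.mc_loss
loss.backward()  # an optimizer step would follow in a real training loop
```

At evaluation time, the classification head is what ranks candidate replies for the Hits@1 metric, while the language-modeling head drives generation and the perplexity and F1 scores.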


Results from the Paper


Ranked #3 on Dialogue Generation on Persona-Chat (using extra training data)

Task                | Dataset      | Model           | Metric Name | Metric Value | Global Rank | Uses Extra Training Data
Dialogue Generation | Persona-Chat | TransferTransfo | Avg F1      | 19.09        | #3          | Yes

Methods