Text generation is the task of producing text that is indistinguishable from human-written text.
Additionally, these models are typically trained via maximum likelihood and teacher forcing.
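As a sketch of what "maximum likelihood and teacher forcing" means in practice, here is a minimal toy example: a count-based bigram model whose next-token distribution is the maximum-likelihood estimate, where every training step conditions on the ground-truth previous token (teacher forcing) rather than on the model's own predictions. The function and variable names are illustrative, not from any paper or library.

```python
from collections import defaultdict

def train_teacher_forced(sequences):
    # Maximum-likelihood estimate of P(next | prev). At every position the
    # model conditions on the ground-truth previous token (teacher forcing),
    # never on a token it generated itself.
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    # Normalize counts into conditional probabilities.
    return {p: {n: c / sum(d.values()) for n, c in d.items()}
            for p, d in counts.items()}

model = train_teacher_forced([["<s>", "a", "b", "</s>"],
                              ["<s>", "a", "c", "</s>"]])
# model["a"] is the MLE distribution over tokens that follow "a"
```

Neural sequence models replace the count table with a learned network, but the training signal is the same: maximize the likelihood of each gold token given the gold prefix.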
#3 best model for Multivariate Time Series Imputation on Basketball Players Movement
This paper shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time.
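"Predicting one data point at a time" is the autoregressive decoding loop: each sampled token is fed back in as the context for the next step. Below is a minimal, self-contained sketch of that loop over a hard-coded toy bigram model; the `bigram_lm` table and function names are hypothetical stand-ins for a trained network.

```python
import random

def generate(model, start="<s>", end="</s>", max_len=20, rng=None):
    # Autoregressive generation: sample one token at a time, feeding each
    # prediction back in as the conditioning context for the next step.
    rng = rng or random.Random(0)
    seq = [start]
    while seq[-1] != end and len(seq) < max_len:
        dist = model.get(seq[-1])
        if not dist:
            break
        tokens, probs = zip(*dist.items())
        seq.append(rng.choices(tokens, weights=probs)[0])
    return seq

# Toy deterministic model: every next-token distribution has one option.
bigram_lm = {"<s>": {"hello": 1.0},
             "hello": {"world": 1.0},
             "world": {"</s>": 1.0}}
out = generate(bigram_lm)
# out == ["<s>", "hello", "world", "</s>"]
```

An LSTM as in the paper conditions on the full hidden state rather than just the last token, but the one-step-at-a-time sampling loop is the same.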
fairseq is an open-source sequence modeling toolkit that allows researchers and developers to train custom models for translation, summarization, language modeling, and other text generation tasks.
We observe that our method consistently outperforms beam search (BS) and previously proposed techniques for diverse decoding from neural sequence models.
In this work, we introduce a model and beam-search training scheme, based on the work of Daume III and Marcu (2005), that extends seq2seq to learn global sequence scores.
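Since beam search recurs in both snippets above, here is a minimal sketch of the basic procedure: keep the `beam_size` highest-scoring prefixes, extend each by every candidate token, and re-prune. The `step_scores` callback interface and `toy_scores` distribution are assumptions for the example, not the interface of any particular system.

```python
import math

def beam_search(step_scores, beam_size=2, length=3):
    # Each beam is a (prefix, cumulative log-probability) pair.
    beams = [((), 0.0)]
    for _ in range(length):
        candidates = []
        for prefix, score in beams:
            # step_scores maps a prefix to {token: log_prob} for the next step.
            for tok, lp in step_scores(prefix).items():
                candidates.append((prefix + (tok,), score + lp))
        # Keep only the beam_size best-scoring extended prefixes.
        beams = sorted(candidates, key=lambda x: x[1], reverse=True)[:beam_size]
    return beams

def toy_scores(prefix):
    # Hypothetical fixed distribution: "a" is always likelier than "b".
    return {"a": math.log(0.6), "b": math.log(0.4)}

best, score = beam_search(toy_scores)[0]
# best == ("a", "a", "a")
```

Standard beam search scores each step locally; the beam-search training scheme described above instead learns scores over whole sequences, and diverse-decoding variants add penalties so the kept beams do not collapse onto near-duplicates.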
#16 best model for Machine Translation on IWSLT2015 German-English
As a new way of training generative models, Generative Adversarial Networks (GANs), which use a discriminative model to guide the training of the generative model, have enjoyed considerable success in generating real-valued data.
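To make "a discriminative model guides the training of the generative model" concrete, here is a deliberately tiny 1-D sketch: the "generator" is a single parameter `theta` that should move toward a real data point, steered only by gradients flowing through a logistic "discriminator". All names and the setup (one real point at 5.0, alternating gradient steps) are assumptions for illustration; real text GANs are far more involved because token sampling is non-differentiable.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

real, theta, w, b, lr = 5.0, 0.0, 0.0, 0.0, 0.05
for _ in range(300):
    d_real = sigmoid(w * real + b)   # discriminator's score on real data
    d_fake = sigmoid(w * theta + b)  # discriminator's score on the fake
    # Discriminator ascent: push D(real) up and D(fake) down
    # (gradient of log D(real) + log(1 - D(fake)) w.r.t. w and b).
    w += lr * ((1 - d_real) * real - d_fake * theta)
    b += lr * ((1 - d_real) - d_fake)
    # Generator ascent on log D(fake): theta only sees the world through
    # the discriminator's gradient, which is the core GAN idea.
    d_fake = sigmoid(w * theta + b)
    theta += lr * (1 - d_fake) * w
# theta drifts from 0.0 toward the real data point at 5.0
```

The generator never observes the real data directly; it improves solely by following the discriminator's signal.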
#2 best model for Text Generation on EMNLP2017 WMT
End-to-end models for goal-oriented dialogue are challenging to train, because linguistic and strategic aspects are entangled in latent state vectors.