MaskGAN: Better Text Generation via Filling in the ______

23 Jan 2018 · William Fedus, Ian Goodfellow, Andrew M. Dai

Neural text generation models are often autoregressive language models or seq2seq models. These models generate text by sampling words sequentially, with each word conditioned on the previous word, and are state-of-the-art for several machine translation and summarization benchmarks...
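The abstract describes autoregressive decoding: each token is sampled conditioned on the tokens generated so far. A minimal sketch of that sampling loop, using a toy bigram table as a stand-in for a trained model (the vocabulary and probabilities below are hypothetical, for illustration only):

```python
import random

def sample_sequence(start, next_token_probs, max_len=10, end_token="<eos>"):
    """Sample tokens one at a time, each conditioned on the previous token."""
    tokens = [start]
    while len(tokens) < max_len:
        dist = next_token_probs[tokens[-1]]        # P(next | previous token)
        words, probs = zip(*dist.items())
        nxt = random.choices(words, weights=probs)[0]
        if nxt == end_token:
            break
        tokens.append(nxt)
    return tokens

# Toy conditional distributions (hypothetical, not from the paper).
bigram = {
    "<bos>": {"the": 0.6, "a": 0.4},
    "the":   {"cat": 0.5, "dog": 0.5},
    "a":     {"cat": 0.5, "dog": 0.5},
    "cat":   {"<eos>": 1.0},
    "dog":   {"<eos>": 1.0},
}

print(sample_sequence("<bos>", bigram))
```

A real language model replaces the bigram lookup with a network that conditions on the full prefix; MaskGAN's contribution is to train such a generator adversarially on a fill-in-the-blank task rather than with maximum likelihood alone.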

Task: Multivariate Time Series Imputation
Dataset: Basketball Players Movement
Model: MaskGAN

Metric                  Value    Global Rank
Path Length             0.793    # 3
OOB Rate (10^-3)        4.592    # 4
Step Change (10^-3)     9.622    # 3
Path Difference         0.680    # 3
Player Distance         0.427    # 5
