BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

ACL 2020 · Mike Lewis, Yinhan Liu, Naman Goyal, Marjan Ghazvininejad, Abdelrahman Mohamed, Omer Levy, Ves Stoyanov, Luke Zettlemoyer

We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text.
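
As a rough illustration of the corruption step, the sketch below implements text infilling in the spirit of the paper: spans of tokens, with lengths drawn from a Poisson distribution, are replaced by a single mask token (a length-0 span simply inserts a mask). The `mask_token` string, the whitespace tokenization, and the exact sampling loop are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def text_infilling(tokens, mask_token="<mask>", mask_ratio=0.3,
                   poisson_lambda=3.0, seed=None):
    """Replace randomly sampled spans of tokens with a single mask token.

    Span lengths are drawn from Poisson(poisson_lambda); roughly mask_ratio
    of the tokens are removed in total. A 0-length span inserts a mask token
    without removing anything. Simplified sketch, not the reference code.
    """
    rng = np.random.default_rng(seed)
    tokens = list(tokens)
    num_to_mask = int(round(len(tokens) * mask_ratio))
    masked = 0
    while masked < num_to_mask and len(tokens) > 0:
        span_len = int(rng.poisson(poisson_lambda))
        start = int(rng.integers(0, len(tokens) + 1))   # span start / insertion point
        span_len = min(span_len, len(tokens) - start)   # clip span to sequence end
        tokens[start:start + span_len] = [mask_token]   # whole span -> one mask token
        masked += max(span_len, 1)                      # guarantee progress on 0-length spans
    return tokens

if __name__ == "__main__":
    sentence = "BART is trained by corrupting text and learning to reconstruct it".split()
    print(text_infilling(sentence, seed=0))
```

The model is then trained to map the corrupted sequence back to the original text with a standard sequence-to-sequence objective.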


Results from the Paper


TASK | DATASET | MODEL | METRIC | VALUE | GLOBAL RANK
Question Answering | SQuAD1.1 dev | BART Base (with text infilling) | F1 | 90.8 | #9

Methods used in the Paper