no code implementations • IJCNLP 2019 • Liang Wang, Wei Zhao, Ruoyu Jia, Sujian Li, Jingming Liu
This paper presents a new sequence-to-sequence (seq2seq) pre-training method, PoDA (Pre-training of Denoising Autoencoders), which learns representations suitable for text generation tasks.
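To make the denoising-autoencoder idea concrete, here is a minimal Python sketch of how such pre-training pairs can be built: raw text is corrupted and the seq2seq model is trained to reconstruct the original. The noising operations and rates below are illustrative assumptions, not PoDA's published configuration.

```python
# Sketch of denoising-autoencoder-style pre-training data creation:
# corrupt a sentence, keep the original as the reconstruction target.
import random

def add_noise(tokens, p_delete=0.1, p_mask=0.1, mask_token="<mask>"):
    """Randomly delete or mask tokens to build the corrupted encoder input."""
    noisy = []
    for tok in tokens:
        r = random.random()
        if r < p_delete:
            continue                      # drop the token entirely
        elif r < p_delete + p_mask:
            noisy.append(mask_token)      # replace the token with a mask
        else:
            noisy.append(tok)
    return noisy

def make_pretraining_pair(sentence):
    """Return (corrupted source, original target) for seq2seq training."""
    tokens = sentence.split()
    return add_noise(tokens), tokens

random.seed(0)
src, tgt = make_pretraining_pair("the quick brown fox jumps over the lazy dog")
print("source:", " ".join(src))
print("target:", " ".join(tgt))
```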
6 code implementations • NAACL 2019 • Wei Zhao, Liang Wang, Kewei Shen, Ruoyu Jia, Jingming Liu
This is the first time that copying words from the source context and fully pre-training a sequence-to-sequence model have been explored together on the GEC (grammatical error correction) task.
Ranked #4 on Grammatical Error Correction on JFLEG
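The copy mechanism mentioned in the abstract is a natural fit for GEC, where most of the input sentence is already correct. A minimal NumPy sketch of the general idea follows; the variable names and gate value are illustrative assumptions, not the paper's exact formulation.

```python
# Copy-augmented decoding: mix a vocabulary distribution with an attention
# distribution over source tokens, so words can be copied verbatim from
# the (mostly correct) input sentence.
import numpy as np

def copy_augmented_distribution(p_vocab, attention, src_ids, p_copy):
    """Mix generation and copy distributions.

    p_vocab   -- softmax over the output vocabulary, shape (vocab_size,)
    attention -- attention weights over source positions, shape (src_len,)
    src_ids   -- vocabulary id of each source token, shape (src_len,)
    p_copy    -- scalar gate in [0, 1]: probability of copying vs. generating
    """
    p_final = (1.0 - p_copy) * p_vocab
    # Scatter-add attention mass onto the vocabulary ids of the source tokens.
    np.add.at(p_final, src_ids, p_copy * attention)
    return p_final

vocab_size = 10
p_vocab = np.full(vocab_size, 1.0 / vocab_size)   # uniform, for illustration
attention = np.array([0.7, 0.2, 0.1])             # decoder attends to token 0
src_ids = np.array([3, 5, 7])                     # source tokens' vocab ids
p = copy_augmented_distribution(p_vocab, attention, src_ids, p_copy=0.5)
print(p, p.sum())  # sums to 1.0; id 3 receives the largest copy boost
```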
no code implementations • COLING 2018 • Liang Wang, Sujian Li, Wei Zhao, Kewei Shen, Meng Sun, Ruoyu Jia, Jingming Liu
Cloze-style reading comprehension has been a popular task for measuring the progress of natural language understanding in recent years.
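For readers unfamiliar with the task format, here is a toy illustration of how a cloze-style example is constructed: a word is blanked out of a passage and the model must recover it. The passage and blanking rule below are invented purely for illustration.

```python
# Toy cloze-style reading-comprehension example: blank out a word from
# the passage; the model's job is to predict the missing word.
passage = "Marie Curie won the Nobel Prize in Physics in 1903."
answer = "Curie"
query = passage.replace(answer, "_____", 1)
print("Query: ", query)
print("Answer:", answer)
```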