no code implementations • 1 Jan 2021 • Seohyun Back, Akhil Kedia, Sai Chetan Chinthakindi, Haejun Lee, Jaegul Choo
We evaluate our method against existing ones in terms of the quality of the generated questions, as well as the accuracy of an MRC model fine-tuned on the data our method synthetically generates.
Ranked #3 on Question Generation on SQuAD1.1 (using extra training data)
no code implementations • ICLR 2020 • Seohyun Back, Sai Chetan Chinthakindi, Akhil Kedia, Haejun Lee, Jaegul Choo
Real-world question answering systems often retrieve potentially relevant documents to a given question through a keyword search, followed by a machine reading comprehension (MRC) step to find the exact answer from them.
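The retrieve-then-read pipeline described above can be sketched with a toy keyword retriever: documents are ranked by how often they contain the question's terms, and the top hits are passed on to the MRC reader. The function name and scoring below are illustrative only; real systems use an inverted index with a scoring function such as BM25.

```python
from collections import Counter

def keyword_retrieve(question, documents, top_k=2):
    """Rank documents by raw question-term frequency (a stand-in for
    the keyword-search stage that precedes the MRC reading step)."""
    q_terms = set(question.lower().split())
    scored = []
    for i, doc in enumerate(documents):
        doc_terms = Counter(doc.lower().split())
        # Score = total occurrences of question terms in the document.
        score = sum(doc_terms[t] for t in q_terms)
        scored.append((score, i))
    scored.sort(reverse=True)  # highest-scoring documents first
    return [documents[i] for score, i in scored[:top_k] if score > 0]

docs = [
    "paris is the capital of france",
    "berlin is in germany",
    "the eiffel tower is in paris",
]
hits = keyword_retrieve("what is the capital of france", docs)
```

The retrieved documents (`hits`) would then be handed to the MRC model to extract the exact answer span.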
no code implementations • 25 Sep 2019 • Akhil Kedia, Sai Chetan Chinthakindi, Seohyun Back, Haejun Lee, Jaegul Choo
We evaluate the question-generation capability of our method by comparing its BLEU score with that of existing methods, and we further test it by fine-tuning the MRC model on the downstream MRC data after training on the synthetic data.
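The BLEU comparison mentioned above can be illustrated with a simplified single-reference, sentence-level BLEU (add-one smoothing on the n-gram precisions, multiplied by a brevity penalty). This is a minimal sketch for intuition only; reported results rely on standard toolkit implementations.

```python
import math
from collections import Counter

def ngram_counts(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def simple_bleu(reference, hypothesis, max_n=4):
    """Simplified sentence-level BLEU against a single reference question."""
    ref, hyp = reference.split(), hypothesis.split()
    precisions = []
    for n in range(1, max_n + 1):
        hyp_ngrams = ngram_counts(hyp, n)
        ref_ngrams = ngram_counts(ref, n)
        overlap = sum((hyp_ngrams & ref_ngrams).values())  # clipped matches
        total = max(sum(hyp_ngrams.values()), 1)
        # Add-one smoothing avoids zero precision on short questions.
        precisions.append((overlap + 1) / (total + 1))
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Brevity penalty discourages overly short generated questions.
    bp = 1.0 if len(hyp) > len(ref) else math.exp(1 - len(ref) / max(len(hyp), 1))
    return bp * geo_mean
```

An identical generated question scores 1.0, while a truncated one is penalized by both lower n-gram precision and the brevity penalty.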
no code implementations • EMNLP 2018 • Sathish Reddy Indurthi, Seunghak Yu, Seohyun Back, Heriberto Cuayáhuitl
In recent years many deep neural networks have been proposed to solve Reading Comprehension (RC) tasks.
Ranked #4 on Question Answering on NarrativeQA
no code implementations • EMNLP 2018 • Seohyun Back, Seunghak Yu, Sathish Reddy Indurthi, Jihie Kim, Jaegul Choo
Machine reading comprehension helps machines learn to utilize the vast amount of human knowledge written in the form of text.
Ranked #27 on Question Answering on TriviaQA (using extra training data)
no code implementations • WS 2018 • Seunghak Yu, Sathish Reddy Indurthi, Seohyun Back, Haejun Lee
Reading Comprehension (RC) of text is one of the fundamental tasks in natural language processing.
Ranked #69 on Question Answering on SQuAD1.1