Distractor Generation
12 papers with code • 1 benchmarks • 2 datasets
Given a passage, a question, and an answer phrase, the goal of distractor generation (DG) is to generate context-related wrong options (i.e., distractors) for multiple-choice questions (MCQs).
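The task's input/output interface can be sketched as a simple data structure (the class name and example content below are illustrative, not taken from any of the papers listed):

```python
from dataclasses import dataclass, field

@dataclass
class DGExample:
    """One distractor-generation instance: the passage, question, and
    correct answer are inputs; distractors are the system's output."""
    passage: str
    question: str
    answer: str
    distractors: list = field(default_factory=list)

example = DGExample(
    passage="The mitochondrion is the organelle that produces most of a cell's ATP.",
    question="Which organelle produces most of a cell's ATP?",
    answer="mitochondrion",
)

# A DG system fills `distractors` with plausible but incorrect options
# that are related to the passage context:
example.distractors = ["nucleus", "ribosome", "Golgi apparatus"]
```

The key quality criterion, reflected across the papers below, is that distractors must be wrong yet plausible enough to be confusable with the answer.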
Latest papers
CDGP: Automatic Cloze Distractor Generation based on Pre-trained Language Model
Manually designing cloze tests consumes enormous time and effort.
A Novel Multi-Stage Prompting Approach for Language Agnostic MCQ Generation using GPT
We introduce a multi-stage prompting approach (MSP) for the generation of multiple choice questions (MCQs), harnessing the capabilities of GPT models such as text-davinci-003 and GPT-4, renowned for their excellence across various NLP tasks.
BRAINTEASER: Lateral Thinking Puzzles for Large Language Models
The success of language models has inspired the NLP community to attend to tasks that require implicit and complex reasoning, relying on human-like commonsense mechanisms.
Distractor generation for multiple-choice questions with predictive prompting and large language models
We also show the gains of our approach in generating high-quality distractors by comparing it with a zero-shot ChatGPT and a few-shot ChatGPT prompted with static examples.
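The few-shot baseline described above (prompting with static examples) can be sketched as a small prompt-builder; the prompt wording and field names here are illustrative assumptions, not the paper's actual prompts:

```python
def build_fewshot_prompt(examples, target):
    """Assemble a few-shot distractor-generation prompt from static
    (question, answer, distractors) demonstration examples, ending with
    the target item left open for the model to complete."""
    parts = []
    for ex in examples:
        parts.append(
            f"Question: {ex['question']}\n"
            f"Answer: {ex['answer']}\n"
            f"Distractors: {', '.join(ex['distractors'])}"
        )
    # The target item omits the distractors so the model generates them.
    parts.append(
        f"Question: {target['question']}\n"
        f"Answer: {target['answer']}\n"
        f"Distractors:"
    )
    return "\n\n".join(parts)

demo = [{
    "question": "Which planet is closest to the Sun?",
    "answer": "Mercury",
    "distractors": ["Venus", "Mars", "Earth"],
}]
target = {"question": "Which gas do plants absorb?", "answer": "carbon dioxide"}
prompt = build_fewshot_prompt(demo, target)
```

A zero-shot variant would simply omit the demonstration examples and send only the final target block.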
EduQG: A Multi-format Multiple Choice Dataset for the Educational Domain
Thus, our versatile dataset can be used for both question and distractor generation, as well as to explore new challenges such as question format conversion.
BERT-based distractor generation for Swedish reading comprehension questions using a small-scale dataset
An important part of constructing multiple-choice questions (MCQs) for reading comprehension assessment is the distractors: the incorrect but preferably plausible answer options.
ZmBART: An Unsupervised Cross-lingual Transfer Framework for Language Generation
In this framework, we further pre-train mBART sequence-to-sequence denoising auto-encoder model with an auxiliary task using monolingual data of three languages.
Quiz-Style Question Generation for News Stories
As a first step towards measuring news informedness at a scale, we study the problem of quiz-style multiple-choice question generation, which may be used to survey users about their knowledge of recent news.
A BERT-based Distractor Generation Scheme with Multi-tasking and Negative Answer Training Strategies
In this paper, we investigate the following two limitations of existing distractor generation (DG) methods.