Distractor Generation

13 papers with code • 1 benchmark • 2 datasets

Given a passage, a question, and an answer phrase, the goal of distractor generation (DG) is to generate context-related wrong options (i.e., distractors) for multiple-choice questions (MCQs).
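To make the task concrete, here is a minimal sketch of a rule-based DG baseline, assuming a pre-extracted candidate pool (the `candidates` list, function names, and the Jaccard scoring heuristic are illustrative choices, not part of any of the methods listed below): it keeps candidates that appear in the passage (context-related) but differ from the answer (wrong), and ranks them by lexical similarity to the answer (plausible).

```python
def jaccard(a: str, b: str) -> float:
    """Word-level Jaccard similarity between two phrases."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def generate_distractors(passage: str, answer: str, candidates: list[str], k: int = 3) -> list[str]:
    # Keep candidates that occur in the passage (context-related)
    # but are not the correct answer (i.e., wrong options).
    pool = [c for c in candidates
            if c.lower() != answer.lower() and c.lower() in passage.lower()]
    # Prefer candidates lexically closer to the answer (more plausible distractors).
    pool.sort(key=lambda c: jaccard(c, answer), reverse=True)
    return pool[:k]

passage = ("The Nile flows through Egypt. The Amazon flows through Brazil. "
           "The Danube flows through Austria.")
distractors = generate_distractors(passage, "Egypt",
                                   ["Brazil", "Austria", "Egypt", "Mars"])
print(distractors)  # → ['Brazil', 'Austria']
```

The papers below replace this heuristic with learned models (sequence-to-sequence generation, learning-to-rank selection, or LLM prompting), but the input/output contract is the same.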

Most implemented papers

Generating Distractors for Reading Comprehension Questions from Real Examinations

Evan-Gao/Distractor-Generation-RACE 8 Sep 2018

We investigate the task of distractor generation for multiple choice reading comprehension questions from examinations.

Distractor Generation for Multiple Choice Questions Using Learning to Rank

harrylclc/LTR-DG WS 2018

We investigate how machine learning models, specifically ranking models, can be used to select useful distractors for multiple choice questions.

A BERT-based Distractor Generation Scheme with Multi-tasking and Negative Answer Training Strategies

voidful/BDG 12 Oct 2020

In this paper, we investigate the following two limitations for the existing distractor generation (DG) methods.

Quiz-Style Question Generation for News Stories

google-research-datasets/NewsQuizQA 18 Feb 2021

As a first step towards measuring news informedness at scale, we study the problem of quiz-style multiple-choice question generation, which may be used to survey users about their knowledge of recent news.

ZmBART: An Unsupervised Cross-lingual Transfer Framework for Language Generation

kaushal0494/ZmBART Findings (ACL) 2021

In this framework, we further pre-train the mBART sequence-to-sequence denoising auto-encoder model with an auxiliary task using monolingual data in three languages.

BERT-based distractor generation for Swedish reading comprehension questions using a small-scale dataset

dkalpakchi/swequad-mc INLG (ACL) 2021

An important part of constructing multiple-choice questions (MCQs) for reading comprehension assessment is the distractors: the incorrect but preferably plausible answer options.

EduQG: A Multi-format Multiple Choice Dataset for the Educational Domain

hadifar/question-generation 12 Oct 2022

Thus, our versatile dataset can be used for both question and distractor generation, as well as to explore new challenges such as question format conversion.

Distractor generation for multiple-choice questions with predictive prompting and large language models

semerekiros/distractgpt 30 Jul 2023

We also show the gains of our approach in generating high-quality distractors by comparing it with zero-shot ChatGPT and few-shot ChatGPT prompted with static examples.

BRAINTEASER: Lateral Thinking Puzzles for Large Language Models

giannispana/ails-ntua-at-semeval-2024-task-9-brainteaser 8 Oct 2023

The success of language models has inspired the NLP community to attend to tasks that require implicit and complex reasoning, relying on human-like commonsense mechanisms.