Search Results for author: Ethan Gotlieb Wilcox

Found 4 papers, 2 papers with code

Revisiting the Optimality of Word Lengths

no code implementations • 6 Dec 2023 • Tiago Pimentel, Clara Meister, Ethan Gotlieb Wilcox, Kyle Mahowald, Ryan Cotterell

Under this method, we find that a language's word lengths should instead be proportional to the surprisal's expectation plus its variance-to-mean ratio.
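The quantity the abstract describes can be illustrated with a minimal sketch. The surprisal values below are hypothetical stand-ins; in the paper, per-occurrence surprisals would come from a language model's -log2 p(word | context).

```python
# Hypothetical per-occurrence surprisal samples (in bits) for one word type.
# In the paper these would be estimated by a language model.
surprisals = [2.0, 4.0, 6.0, 8.0]

mean = sum(surprisals) / len(surprisals)
variance = sum((s - mean) ** 2 for s in surprisals) / len(surprisals)

# The paper's proposal: a word's (relative) optimal length is proportional
# to the surprisal's expectation plus its variance-to-mean ratio.
predicted_length = mean + variance / mean
print(predicted_length)  # → 6.0
```

Note that under this quantity, two words with the same mean surprisal but different variance receive different predicted lengths, which is what distinguishes it from a plain expected-surprisal account.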

Testing the Predictions of Surprisal Theory in 11 Languages

no code implementations • 7 Jul 2023 • Ethan Gotlieb Wilcox, Tiago Pimentel, Clara Meister, Ryan Cotterell, Roger P. Levy

We address this gap in the current literature by investigating the relationship between surprisal and reading times in eleven different languages, distributed across five language families.
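The core relationship being tested, that a word's reading time increases with its surprisal, can be sketched as a simple least-squares fit. The data points below are illustrative only; the paper's actual analyses use regression models fit to real reading-time corpora across the eleven languages.

```python
# Illustrative data: surprisal (bits) and per-word reading time (ms).
surprisal = [1.0, 2.0, 3.0, 4.0]
reading_time = [210.0, 230.0, 250.0, 270.0]

# Ordinary least-squares slope and intercept for reading_time ~ surprisal.
n = len(surprisal)
mx = sum(surprisal) / n
my = sum(reading_time) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(surprisal, reading_time))
         / sum((x - mx) ** 2 for x in surprisal))
intercept = my - slope * mx
print(slope, intercept)  # → 20.0 190.0
```

A positive fitted slope is the signature of the surprisal effect: each additional bit of surprisal predicts additional reading time.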

A Targeted Assessment of Incremental Processing in Neural Language Models and Humans

1 code implementation • 6 Jun 2021 • Ethan Gotlieb Wilcox, Pranali Vani, Roger P. Levy

We present a targeted, scaled-up comparison of incremental processing in humans and neural language models by collecting by-word reaction time data for sixteen different syntactic test suites across a range of structural phenomena.

Tasks: Language Modelling, Sentence

On the Predictive Power of Neural Language Models for Human Real-Time Comprehension Behavior

1 code implementation • 2 Jun 2020 • Ethan Gotlieb Wilcox, Jon Gauthier, Jennifer Hu, Peng Qian, Roger Levy

Human reading behavior is tuned to the statistics of natural language: the time it takes human subjects to read a word can be predicted from estimates of the word's probability in context.
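The probability estimate mentioned here is conventionally expressed as surprisal, the negative log probability of the word in context. A minimal sketch (the probability value is hypothetical, not from the paper's data):

```python
import math

def surprisal_bits(p: float) -> float:
    """Surprisal in bits: -log2 of a word's probability in context."""
    return -math.log2(p)

# A word assigned probability 1/8 by a language model carries 3 bits of
# surprisal; higher surprisal predicts longer reading times.
print(surprisal_bits(0.125))  # → 3.0
```

This is the link quantity the paper uses to compare language models' predictions against human reading behavior.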

Tasks: Open-Ended Question Answering
