no code implementations • 22 Nov 2020 • Karthikeya Ramesh Kaushik, Andrea E. Martin
Drawing on linguistics and set theory, a formalisation of these ideas is presented in the first half of this thesis.
no code implementations • Findings of the Association for Computational Linguistics 2020 • Maryam Hashemzadeh, Greta Kaufeld, Martha White, Andrea E. Martin, Alona Fyshe
The representations generated by many models of language (word embeddings, recurrent neural networks, and transformers) correlate with brain activity recorded while people read.
no code implementations • 11 Oct 2019 • Leonidas A. A. Doumas, Guillermo Puebla, Andrea E. Martin, John E. Hummel
People readily generalize knowledge to novel domains and stimuli.
1 code implementation • 12 May 2019 • Guillermo Puebla, Andrea E. Martin, Leonidas A. A. Doumas
In the present study we tested the Story Gestalt model (St. John, 1992), a classic PDP model of text comprehension, and a Sequence-to-Sequence with Attention model (Bahdanau et al., 2015), a contemporary deep learning architecture for text processing.
no code implementations • 2 Oct 2018 • Andrea E. Martin, Leonidas A. A. Doumas
Humans learn complex latent structures from their environments (e.g., natural language, mathematics, music, social hierarchies).
no code implementations • 5 Jun 2018 • Leonidas A. A. Doumas, Guillermo Puebla, Andrea E. Martin
Humans readily generalize, applying prior knowledge to novel situations and stimuli.