1 code implementation • CoNLL (EMNLP) 2021 • Mitja Nikolaus, Abdellah Fourtassi
In this work, we propose a model integrating both perception- and production-based learning using artificial neural networks, which we train on a large corpus of crowd-sourced images with corresponding descriptions.
no code implementations • EMNLP (CMCL) 2020 • Abdellah Fourtassi
The free association task has been very influential both in cognitive science and in computational linguistics.
no code implementations • EMNLP (CMCL) 2020 • Thomas Misiek, Benoit Favre, Abdellah Fourtassi
Interactive alignment is a major mechanism of linguistic coordination.
no code implementations • NAACL (CMCL) 2021 • Franck Dary, Alexis Nasr, Abdellah Fourtassi
In this paper, we describe our contribution to the CMCL 2021 Shared Task, which consists of predicting five different eye-tracking variables from tokenized English text.
1 code implementation • NAACL (CMCL) 2021 • Mitja Nikolaus, Abdellah Fourtassi
When learning their native language, children acquire the meanings of words and sentences from highly ambiguous input without much explicit supervision.
no code implementations • 21 Mar 2024 • Mitja Nikolaus, Abhishek Agrawal, Petros Kaklamanis, Alex Warstadt, Abdellah Fourtassi
The acquisition of grammar has been a central question for adjudicating between theories of language acquisition.
1 code implementation • 21 Oct 2022 • Mitja Nikolaus, Emmanuelle Salin, Stephane Ayache, Abdellah Fourtassi, Benoit Favre
Recent advances in vision-and-language modeling have seen the development of Transformer architectures that achieve remarkable performance on multimodal reasoning tasks.
no code implementations • WS 2019 • Abdellah Fourtassi, Isaac Scheinfeld, Michael Frank
How do children learn abstract concepts such as animal vs. artifact?