Grounded language learning

23 papers with code • 0 benchmarks • 1 dataset

Grounded language learning is the task of acquiring the meaning of language in situated environments.

Most implemented papers

Pretraining on Interactions for Learning Grounded Affordance Representations

jmerullo/affordances *SEM (NAACL) 2022

Lexical semantics and cognitive science point to affordances (i.e., the actions that objects support) as critical for understanding and representing nouns and verbs.

Compositional Generalization in Grounded Language Learning via Induced Model Sparsity

aalto-ai/sparse-compgen NAACL (ACL) 2022

We provide a study of how induced model sparsity can help achieve compositional generalization and better sample efficiency in grounded language learning problems.
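The paper's actual architecture is in the linked repository; as a minimal, hypothetical sketch of what "inducing sparsity" can look like in practice, the snippet below adds an L1 penalty to a word-object interaction matrix so that only a few entries remain active after training. The class name, dimensions, and the 1e-3 coefficient are illustrative assumptions, not the paper's settings.

```python
import torch
import torch.nn as nn

class SparseGroundingScorer(nn.Module):
    """Scores word-object compatibility through a learned interaction matrix;
    an L1 penalty on that matrix induces sparsity during training."""

    def __init__(self, word_dim: int, obj_dim: int):
        super().__init__()
        # Small random init; the L1 term below drives many entries toward zero.
        self.interaction = nn.Parameter(0.01 * torch.randn(word_dim, obj_dim))

    def forward(self, words: torch.Tensor, objects: torch.Tensor) -> torch.Tensor:
        # words: (batch, word_dim), objects: (batch, obj_dim) -> one score per pair
        return torch.einsum("bw,wo,bo->b", words, self.interaction, objects)

    def l1_penalty(self) -> torch.Tensor:
        return self.interaction.abs().sum()


model = SparseGroundingScorer(word_dim=32, obj_dim=32)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Dummy batch: in a real grounded-language setup these would come from a
# language encoder and an observation encoder.
words = torch.randn(8, 32)
objects = torch.randn(8, 32)
targets = torch.rand(8)  # dummy compatibility targets

scores = model(words, objects)
loss = nn.functional.mse_loss(scores, targets) + 1e-3 * model.l1_penalty()
loss.backward()
optimizer.step()
```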

Lexicon-Level Contrastive Visual-Grounding Improves Language Modeling

EvLab-MIT/LexiContrastiveGrd 21 Mar 2024

Today's most accurate language models are trained on orders of magnitude more language data than human language learners receive, but with no supervision from other sensory modalities that play a crucial role in human learning.
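The repository linked above contains the authors' actual objective; as a rough, generic sketch of what a lexicon-level contrastive visual-grounding loss can look like, the snippet below aligns word embeddings with paired image embeddings via an InfoNCE-style loss. Function name, embedding dimensions, and temperature are illustrative assumptions rather than the paper's settings.

```python
import torch
import torch.nn.functional as F

def contrastive_grounding_loss(word_emb: torch.Tensor,
                               image_emb: torch.Tensor,
                               temperature: float = 0.07) -> torch.Tensor:
    """InfoNCE-style loss that pulls each word embedding toward its paired
    visual embedding and pushes it away from the other pairs in the batch."""
    word_emb = F.normalize(word_emb, dim=-1)
    image_emb = F.normalize(image_emb, dim=-1)
    logits = word_emb @ image_emb.t() / temperature   # (batch, batch) cosine similarities
    targets = torch.arange(word_emb.size(0))          # i-th word matches i-th image
    # Symmetric: word-to-image and image-to-word directions.
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))

# Dummy embeddings; a real setup would take word representations from a
# language model and visual features from an image encoder.
words = torch.randn(16, 256)
images = torch.randn(16, 256)
loss = contrastive_grounding_loss(words, images)
```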