no code implementations • ICLR 2022 • Shangmin Guo, Yi Ren, Kory Wallace Mathewson, Simon Kirby, Stefano V. Albrecht, Kenny Smith
Researchers are using deep learning models to explore the emergence of language in various language games, where simulated agents interact and develop an emergent language to solve a task.
1 code implementation • 7 Jun 2021 • Shangmin Guo, Yi Ren, Kory Mathewson, Simon Kirby, Stefano V. Albrecht, Kenny Smith
Researchers are using deep learning models to explore the emergence of language in various language games, where agents interact and develop an emergent language to solve tasks.
1 code implementation • 19 Mar 2021 • Andres Karjus, Richard A. Blythe, Simon Kirby, Tianyu Wang, Kenny Smith
Colexification refers to the phenomenon of multiple meanings sharing one word in a language.
1 code implementation • 16 Jun 2020 • Andres Karjus, Richard A. Blythe, Simon Kirby, Kenny Smith
By contrast, in topics that are increasing in importance for language users, near-synonymous words tend not to compete directly and can coexist.
1 code implementation • ICLR 2020 • Yi Ren, Shangmin Guo, Matthieu Labeau, Shay B. Cohen, Simon Kirby
The principle of compositionality, which enables natural language to represent complex concepts via a structured combination of simpler ones, allows us to convey an open-ended set of messages using a limited vocabulary.
1 code implementation • 3 Nov 2018 • Andres Karjus, Richard A. Blythe, Simon Kirby, Kenny Smith
Newberry et al. (Detecting evolutionary forces in language change, Nature 551, 2017) tackle an important but difficult problem in linguistics, the testing of selective theories of language change against a null model of drift.
1 code implementation • 2 Jun 2018 • Andres Karjus, Richard A. Blythe, Simon Kirby, Kenny Smith
In this work, we introduce a simple model for controlling for topical fluctuations in corpora - the topical-cultural advection model - and demonstrate how it provides a robust baseline of variability in word frequency changes over time.
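The core idea of the advection model can be sketched in a few lines: a word's advection score is the association-weighted mean of the frequency changes of its topically related context words. The function and variable names below are illustrative assumptions, not taken from the paper's released code, and association weights (e.g. PPMI scores) are assumed to be precomputed.

```python
def advection(word, freq_change, assoc):
    """Topical-cultural advection (sketch).

    freq_change: dict mapping each word to its (log) frequency change
                 between two time periods.
    assoc:       dict mapping the target word to {context_word: weight},
                 e.g. PPMI association scores with its topic words.

    Returns the weighted mean frequency change of the word's topic,
    which serves as a baseline against which the word's own change
    can be compared.
    """
    weights = assoc[word]
    total = sum(weights.values())
    return sum(w * freq_change[c] for c, w in weights.items()) / total

# Toy illustration: "ship" is associated with "sail" and "mast";
# its topical baseline is the weighted mean of their changes.
freq_change = {"sail": 0.2, "mast": 0.4}
assoc = {"ship": {"sail": 1.0, "mast": 1.0}}
baseline = advection("ship", freq_change, assoc)  # 0.3
```

A word whose observed frequency change exceeds this topical baseline is changing for reasons beyond mere fluctuation in the popularity of its topic.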
no code implementations • 9 Mar 2017 • Vanessa Ferdinand, Simon Kirby, Kenny Smith
Regularization occurs when the output a learner produces is less variable than the linguistic data they observed.
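"Less variable" can be made precise by comparing the entropy of the variant distribution a learner observed with the entropy of the distribution they produced. The snippet below is a minimal sketch of that comparison, assuming Shannon entropy as the variability measure; the toy data are illustrative.

```python
from collections import Counter
import math

def entropy(seq):
    """Shannon entropy (in bits) of the variant distribution in seq."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A learner observes two variants in equal proportion (maximally variable)
# but produces a skewed distribution favouring one variant.
observed = ["a", "a", "b", "b"]
produced = ["a", "a", "a", "b"]

# Regularization: the output distribution is less variable than the input.
is_regularized = entropy(produced) < entropy(observed)
```

Here `entropy(observed)` is 1.0 bit while `entropy(produced)` is about 0.81 bits, so the learner's output counts as regularized under this measure.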