no code implementations • PaM 2020 • Mehrnoosh Sadrzadeh, Gijs Wijnholds
It is possible to overcome the computational hurdles by working with fuzzy generalised quantifiers.
no code implementations • 7 Feb 2024 • Kin Ian Lo, Mehrnoosh Sadrzadeh, Shane Mansfield
Then, we show how an extension of a natural language processing challenge known as the Winograd Schema, which involves anaphoric ambiguities, can be modelled on the Bell-CHSH scenario with a contextual fraction of 0.096.
1 code implementation • 1 Dec 2023 • Hadi Wazni, Mehrnoosh Sadrzadeh
Previous work extended the QNLP translation to discourse structure using points in a closure of Hilbert spaces.
no code implementations • 31 Aug 2023 • Kin Ian Lo, Mehrnoosh Sadrzadeh, Shane Mansfield
In this work, we focus on coreference ambiguities and investigate the Winograd Schema Challenge (WSC), a test proposed by Levesque in 2011 to evaluate the intelligence of machines.
no code implementations • 1 Aug 2023 • Michael Moortgat, Mehrnoosh Sadrzadeh
By calling into question the implicit structural rules that are taken for granted in classical logic, substructural logics have brought to the fore new forms of reasoning with applications in many interdisciplinary areas of interest.
no code implementations • 11 Aug 2022 • Kin Ian Lo, Mehrnoosh Sadrzadeh, Shane Mansfield
Ambiguities of natural language do not preclude us from using it, and context helps in getting ideas across.
no code implementations • 10 Aug 2022 • Hadi Wazni, Kin Ian Lo, Lachlan McPheat, Mehrnoosh Sadrzadeh
We use the Lambek Calculus with soft sub-exponential modalities to model and reason about discourse relations such as anaphora and ellipsis.
no code implementations • 14 Jun 2022 • Daphne Wang, Mehrnoosh Sadrzadeh
Ambiguity is a natural language phenomenon occurring at different levels of syntax, semantics, and pragmatics.
1 code implementation • 14 Feb 2022 • Manuel Accettulli Huber, Adriana Correia, Sanjaye Ramgoolam, Mehrnoosh Sadrzadeh
The Linguistic Matrix Theory programme introduced by Kartsaklis, Ramgoolam and Sadrzadeh is an approach to the statistics of matrices generated in type-driven distributional semantics. It is based on permutation invariant polynomial functions, which are regarded as the key observables encoding the significant statistics.
no code implementations • 22 Nov 2021 • Lachlan McPheat, Hadi Wazni, Mehrnoosh Sadrzadeh
We develop a vector space semantics for Lambek Calculus with Soft Subexponentials, apply the calculus to construct compositional vector interpretations for parasitic gap noun phrases and discourse units with anaphora and ellipsis, and experiment with the constructions in a distributional sentence similarity task.
no code implementations • 23 Sep 2021 • Matej Dostal, Mehrnoosh Sadrzadeh, Gijs Wijnholds
We show that this category is a concrete instantiation of the compositional distributional model.
no code implementations • 23 Sep 2021 • Mehrnoosh Sadrzadeh
Pregroup grammars were developed in 1999 and remained Lambek's preferred algebraic model of grammar.
no code implementations • ACL (SemSpace, IWCS) 2021 • Daphne Wang, Mehrnoosh Sadrzadeh, Samson Abramsky, Victor H. Cervantes
Language is contextual as meanings of words are dependent on their contexts.
1 code implementation • CONLL 2020 • Gijs Wijnholds, Mehrnoosh Sadrzadeh, Stephen Clark
This paper is about learning word representations using grammatical type information.
no code implementations • 23 Sep 2020 • Saba Nazir, Taner Cagali, Chris Newell, Mehrnoosh Sadrzadeh
Multimodal information originates from a variety of sources: audiovisual files, textual descriptions, and metadata.
no code implementations • 12 May 2020 • Michael Moortgat, Mehrnoosh Sadrzadeh, Gijs Wijnholds
The interpretation of parasitic gaps is an ostensible case of non-linearity in natural language composition.
no code implementations • 6 May 2020 • Lachlan McPheat, Mehrnoosh Sadrzadeh, Hadi Wazni, Gijs Wijnholds
We develop a categorical compositional distributional semantics for Lambek Calculus with a Relevant Modality, !L*, which has restricted versions of the contraction and permutation rules.
no code implementations • 2 Jan 2020 • Dan Shiebler, Alexis Toumi, Mehrnoosh Sadrzadeh
In this work we define formal grammars in terms of free monoidal categories, along with a functor from the category of formal grammars to the category of automata.
1 code implementation • 19 Dec 2019 • Sanjaye Ramgoolam, Mehrnoosh Sadrzadeh, Lewis Sword
Using the recently solved general 13-parameter permutation invariant Gaussian matrix models, and a dataset of matrices constructed via standard techniques in distributional semantics, we find that the expectation values of a large class of cubic and quartic observables show high Gaussianity, at levels between 90 and 99 percent.
1 code implementation • NAACL 2019 • Gijs Wijnholds, Mehrnoosh Sadrzadeh
Our results show that non-linear addition and a non-linear tensor-based composition outperform the naive non-compositional baselines and the linear models, and that sentence encoders perform well on sentence similarity, but not on verb disambiguation.
no code implementations • 5 May 2019 • Gijs Wijnholds, Mehrnoosh Sadrzadeh
We review previous compositional distributional models of relative pronouns, coordination and a restricted account of ellipsis in the DisCoCat framework of Coecke et al. (2010, 2013).
no code implementations • 8 Nov 2018 • Gijs Wijnholds, Mehrnoosh Sadrzadeh
This paper compares classical copying and quantum entanglement in natural language by considering the case of verb phrase (VP) ellipsis.
no code implementations • 1 Nov 2018 • Mehrnoosh Sadrzadeh, Matthew Purver, Julian Hough, Ruth Kempson
One of the fundamental requirements for models of semantic processing in dialogue is incrementality: a model must reflect how people interpret and generate language at least on a word-by-word basis, and handle phenomena such as fragments, incomplete and jointly-produced utterances.
no code implementations • 26 Oct 2018 • Mehrnoosh Sadrzadeh, Reinhard Muskens
Vector models of language are based on the contextual aspects of language, the distributions of words and how they co-occur in text.
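The co-occurrence view described above can be sketched in a few lines. This is a minimal illustration, not the paper's method: the toy corpus, the window size, and the raw-count weighting are all assumptions made for the example.

```python
from collections import Counter, defaultdict

def cooccurrence_vectors(corpus, window=2):
    """Build distributional vectors: count how often each word
    co-occurs with other words within a fixed-size window."""
    vectors = defaultdict(Counter)
    for sentence in corpus:
        tokens = sentence.lower().split()
        for i, word in enumerate(tokens):
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    vectors[word][tokens[j]] += 1
    return vectors

corpus = ["dogs chase cats", "cats chase mice", "dogs like bones"]
vecs = cooccurrence_vectors(corpus)
# "dogs" and "cats" end up with overlapping contexts (both co-occur
# with "chase"), which is what makes them distributionally similar.
```

In practice the raw counts would be reweighted (e.g. with PMI) and the vocabulary fixed in advance, but the principle is the same: a word's vector is its distribution over contexts.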
no code implementations • 28 Mar 2017 • Dimitrios Kartsaklis, Sanjaye Ramgoolam, Mehrnoosh Sadrzadeh
We propose a Matrix Theory approach to this data, based on permutation symmetry along with Gaussian weights and their perturbations.
no code implementations • COLING 2016 • Mehrnoosh Sadrzadeh, Dimitri Kartsaklis
Compositional distributional models of meaning (CDMs) provide a function that produces a vectorial representation for a phrase or a sentence by composing the vectors of its words.
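Two of the simplest such composition functions, point-wise addition and point-wise multiplication, can be sketched as follows; the 3-dimensional vectors are hypothetical toy values, not learned representations.

```python
import numpy as np

# Hypothetical toy word vectors over three context features.
red = np.array([1.0, 0.2, 0.0])
car = np.array([0.4, 1.0, 0.6])

def compose_additive(vectors):
    """Point-wise sum of the word vectors."""
    return np.sum(vectors, axis=0)

def compose_multiplicative(vectors):
    """Point-wise product: a feature survives only if both words share it."""
    return np.prod(vectors, axis=0)

red_car_add = compose_additive([red, car])         # [1.4, 1.2, 0.6]
red_car_mult = compose_multiplicative([red, car])  # [0.4, 0.2, 0.0]
```

Addition keeps every feature of every word, while multiplication acts as an intersection; tensor-based CDMs go further by letting the grammatical type of a word determine the shape of its representation.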
no code implementations • COLING 2016 • Dimitri Kartsaklis, Mehrnoosh Sadrzadeh
According to the distributional inclusion hypothesis, entailment between words can be measured via the feature inclusions of their distributional vectors.
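One simple measure in this family can be sketched as follows: the fraction of the narrower word's feature mass that is covered by the broader word. The toy vectors are hypothetical, and real systems use more refined measures in the same spirit.

```python
import numpy as np

def feature_inclusion(narrow, broad):
    """Fraction of the narrower word's feature mass covered by the
    broader word. Values near 1 suggest the narrower word's contexts
    are included in the broader word's contexts (e.g. dog |= animal)."""
    narrow = np.asarray(narrow, dtype=float)
    broad = np.asarray(broad, dtype=float)
    return float(np.minimum(narrow, broad).sum() / narrow.sum())

# Hypothetical feature counts over four shared contexts.
dog = np.array([3.0, 2.0, 0.0, 1.0])
animal = np.array([3.0, 4.0, 2.0, 1.0])

feature_inclusion(dog, animal)  # 1.0: every 'dog' feature occurs for 'animal'
feature_inclusion(animal, dog)  # 0.6: inclusion fails in the other direction
```

The asymmetry is the point: entailment is directional, so a good inclusion measure should score `dog ⊨ animal` higher than the reverse.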
no code implementations • 4 Aug 2016 • Mehrnoosh Sadrzadeh
In previous work with J. Hedges, we formalised a generalised quantifiers theory of natural language in categorical compositional distributional semantics with the help of bialgebras.
no code implementations • 4 Feb 2016 • Jules Hedges, Mehrnoosh Sadrzadeh
Categorical compositional distributional semantics is a model of natural language; it combines the statistical vector space models of words with the compositional models of grammar.
no code implementations • 14 Dec 2015 • Esma Balkir, Dimitri Kartsaklis, Mehrnoosh Sadrzadeh
In categorical compositional distributional semantics, phrase and sentence representations are functions of their grammatical structure and representations of the words therein.
no code implementations • 22 Jun 2015 • Esma Balkir, Mehrnoosh Sadrzadeh, Bob Coecke
The categorical compositional distributional model of Coecke et al. (2010) suggests a way to combine the grammatical composition of formal, type-logical models with the corpus-based, empirical word representations of distributional semantics.
no code implementations • WS 2015 • Dimitri Kartsaklis, Mehrnoosh Sadrzadeh
The categorical compositional distributional model of Coecke, Sadrzadeh and Clark provides a linguistically motivated procedure for computing the meaning of a sentence as a function of the distributional meaning of the words therein.
no code implementations • 3 Feb 2015 • Robin Piedeleu, Dimitri Kartsaklis, Bob Coecke, Mehrnoosh Sadrzadeh
Moreover, just like CQM allows for varying the model in which we interpret quantum axioms, one can also vary the model in which we interpret word meaning.
no code implementations • EMNLP 2014 • Dmitrijs Milajevs, Dimitri Kartsaklis, Mehrnoosh Sadrzadeh, Matthew Purver
We provide a comparative study between neural word representations and traditional vector spaces based on co-occurrence counts, in a number of compositional tasks.
no code implementations • ACL 2014 • Dimitri Kartsaklis, Nal Kalchbrenner, Mehrnoosh Sadrzadeh
This paper provides a method for improving tensor-based compositional distributional models of meaning by the addition of an explicit disambiguation step prior to composition.
no code implementations • 18 Jun 2014 • Mehrnoosh Sadrzadeh, Stephen Clark, Bob Coecke
Within the categorical compositional distributional model of meaning, we provide semantic interpretations for the subject and object roles of the possessive relative pronoun `whose'.
no code implementations • 12 May 2014 • Dimitri Kartsaklis, Mehrnoosh Sadrzadeh
In both quantum mechanics and corpus linguistics based on vector spaces, the notion of entanglement provides a means for the various subsystems to communicate with each other.
no code implementations • 21 Apr 2014 • Mehrnoosh Sadrzadeh, Stephen Clark, Bob Coecke
This paper develops a compositional vector-based semantics of subject and object relative pronouns within a categorical framework.
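In this style of model, the copying map contributed by the relative pronoun becomes element-wise multiplication of vectors: the head noun is intersected with the property computed by the clause. The sketch below uses a 2-dimensional toy noun space with made-up values; the verb matrix and all numbers are assumptions for illustration only.

```python
import numpy as np

# Hypothetical 2-dimensional noun space.
men = np.array([1.0, 0.5])
dogs = np.array([0.2, 1.0])
like = np.array([[0.9, 0.1],   # hypothetical verb matrix in N (x) N
                 [0.3, 0.7]])

# Subject relative clause "men who like dogs":
# first compute the property "like dogs" by applying the verb matrix,
# then merge it with the head noun via point-wise multiplication
# (the vector-space image of the copying map).
clause = like @ dogs      # [0.28, 0.76]
meaning = men * clause    # [0.28, 0.38]
```

The result lives in the noun space, as a relative clause should: "men who like dogs" denotes a (modified) noun, not a sentence.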
no code implementations • 13 Mar 2014 • Samson Abramsky, Mehrnoosh Sadrzadeh
Language is contextual and sheaf theory provides a high level mathematical framework to model contextuality.
no code implementations • 23 Jan 2014 • Dimitri Kartsaklis, Mehrnoosh Sadrzadeh, Stephen Pulman, Bob Coecke
They also provide semantics for Lambek's pregroup algebras, applied to formalizing the grammatical structure of natural language, and are implicit in a distributional model of word meaning based on vector spaces.
no code implementations • 2 May 2013 • Stephen Clark, Bob Coecke, Edward Grefenstette, Stephen Pulman, Mehrnoosh Sadrzadeh
We discuss an algorithm which produces the meaning of a sentence given meanings of its words, and its resemblance to quantum teleportation.
1 code implementation • 20 Jun 2011 • Edward Grefenstette, Mehrnoosh Sadrzadeh
The evaluation is based on the word disambiguation task developed by Mitchell and Lapata (2008) for intransitive sentences, and on a similar new experiment designed for transitive sentences.
2 code implementations • 23 Mar 2010 • Bob Coecke, Mehrnoosh Sadrzadeh, Stephen Clark
We propose a mathematical framework for a unification of the distributional theory of meaning in terms of vector space models, and a compositional theory for grammatical types, for which we rely on the algebra of Pregroups, introduced by Lambek.
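In this framework a transitive verb, with pregroup type n^r s n^l, lives in the tensor space N ⊗ S ⊗ N, and the pregroup reductions become tensor contractions. A minimal sketch with random placeholder tensors (the dimensions and values are assumptions, not data from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
dim_n, dim_s = 4, 3  # hypothetical noun- and sentence-space dimensions

subj = rng.random(dim_n)                  # toy noun vectors in N
obj = rng.random(dim_n)
verb = rng.random((dim_n, dim_s, dim_n))  # transitive verb in N (x) S (x) N

# The pregroup reduction n . (n^r s n^l) . n becomes a contraction:
# the subject contracts with the verb's first noun leg, the object
# with its last, leaving a vector in the sentence space S.
sentence = np.einsum('i,isj,j->s', subj, verb, obj)
```

Whatever the nouns and verb are, the output always lands in the same sentence space, which is what lets sentences of different grammatical shapes be compared directly.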