1 code implementation • 15 Jun 2020 • Chris Hokamp, Demian Gholipour Ghalandari, Nghia The Pham, John Glover
Sequence-to-sequence (s2s) models are the basis for extensive work in natural language processing.
1 code implementation • ACL 2020 • Demian Gholipour Ghalandari, Chris Hokamp, Nghia The Pham, John Glover, Georgiana Ifrim
Multi-document summarization (MDS) aims to compress the content of large document collections into short summaries and has important applications in story clustering for newsfeeds, presentation of search results, and timeline generation.
no code implementations • 6 Feb 2017 • Gemma Boleda, Sebastian Padó, Nghia The Pham, Marco Baroni
Reference is a crucial property of language that allows us to connect linguistic expressions to the world.
no code implementations • 23 May 2016 • Angeliki Lazaridou, Nghia The Pham, Marco Baroni
We propose an interactive multimodal framework for language learning.
no code implementations • 8 Mar 2016 • Angeliki Lazaridou, Nghia The Pham, Marco Baroni
As a first step towards agents learning to communicate about their visual environment, we propose a system that, given visual representations of a referent (cat) and a context (sofa), identifies their discriminative attributes, i.e., properties that distinguish them (has_tail).
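The task described above can be illustrated with a toy sketch: if each object is associated with a set of attributes, the discriminative attributes of a referent relative to a context are those the referent has but the context lacks. The attribute sets below are invented for illustration; the paper's system predicts such attributes from visual representations rather than symbolic sets.

```python
# Toy illustration of the discriminative-attribute task (attribute sets are made up).
cat = {"has_tail", "has_fur", "is_animal", "has_legs"}
sofa = {"has_legs", "is_furniture", "has_cushions"}

# Attributes that distinguish the referent (cat) from the context (sofa).
discriminative = cat - sofa
print(sorted(discriminative))  # ['has_fur', 'has_tail', 'is_animal']
```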
no code implementations • HLT 2015 • Angeliki Lazaridou, Nghia The Pham, Marco Baroni
We extend the SKIP-GRAM model of Mikolov et al. (2013a) by taking visual information into account.
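A minimal sketch of the idea of grounding skip-gram in visual information: alongside the standard negative-sampling skip-gram objective, add a max-margin term that pulls a word's embedding toward a fixed visual feature vector for that word. All vectors, the margin value, and the loss combination below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50
vocab = ["cat", "sofa", "sits", "the", "on"]

# Learned word and context embeddings; a fixed visual vector (e.g. CNN feature)
# for the visually grounded word "cat" (all randomly initialized here).
W = {w: rng.normal(scale=0.1, size=dim) for w in vocab}
C = {w: rng.normal(scale=0.1, size=dim) for w in vocab}
V = {"cat": rng.normal(size=dim)}

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def skipgram_loss(target, context, negatives):
    """Negative-sampling skip-gram loss for one (target, context) pair."""
    loss = -np.log(sigmoid(W[target] @ C[context]))
    for neg in negatives:
        loss -= np.log(sigmoid(-(W[target] @ C[neg])))
    return loss

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def visual_margin_loss(word, margin=0.5):
    """Max-margin term: the word's embedding should be closer to its own
    visual vector than other words' embeddings are (margin is an assumption)."""
    pos = cos(W[word], V[word])
    loss = 0.0
    for other in vocab:
        if other != word:
            loss += max(0.0, margin - pos + cos(W[other], V[word]))
    return loss

# Combined objective for one training example: textual term + visual term.
total = skipgram_loss("cat", "sits", negatives=["the", "on"]) + visual_margin_loss("cat")
print(total)
```

In training, both terms would be minimized jointly by gradient descent over W and C, so the word vectors pick up visual regularities while still modeling textual co-occurrence.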