Search Results for author: Jean Maillard

Found 21 papers, 7 papers with code

Towards Being Parameter-Efficient: A Stratified Sparsely Activated Transformer with Dynamic Capacity

1 code implementation • 3 May 2023 • Haoran Xu, Maha Elbayad, Kenton Murray, Jean Maillard, Vedanuj Goswami

Mixture-of-experts (MoE) models that employ sparse activation have proven effective at significantly increasing parameter counts while keeping per-token computational cost low.
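The sparse-activation idea can be sketched with a toy top-k routing layer (a minimal illustration only; the function names, the linear gate, and the softmax-over-top-k mixing are assumptions for the sketch, not the paper's implementation):

```python
import math
import random

def softmax(xs):
    m = max(xs)
    es = [math.exp(v - m) for v in xs]
    s = sum(es)
    return [e / s for e in es]

def moe_token(x, gate_cols, experts, k=2):
    """Sparsely activated layer for one token: score every expert with a
    linear gate, but *run* only the top-k, mixing their outputs by the
    softmax of their gate scores. Total parameters grow with the number
    of experts, while per-token compute stays proportional to k."""
    scores = [sum(xi * wi for xi, wi in zip(x, col)) for col in gate_cols]
    top = sorted(range(len(scores)), key=scores.__getitem__)[-k:]
    mix = softmax([scores[i] for i in top])
    out = [0.0] * len(x)
    for w, i in zip(mix, top):
        y = experts[i](x)  # only k of the n_experts ever execute
        out = [o + w * yi for o, yi in zip(out, y)]
    return out

random.seed(0)
d, n_experts = 4, 8
gate_cols = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n_experts)]
# eight experts' worth of parameters, but each token only pays for two
mats = [[[random.gauss(0, 1) for _ in range(d)] for _ in range(d)]
        for _ in range(n_experts)]
experts = [lambda v, M=M: [sum(a * b for a, b in zip(row, v)) for row in M]
           for M in mats]
x = [random.gauss(0, 1) for _ in range(d)]
y = moe_token(x, gate_cols, experts, k=2)
print(len(y))  # 4
```

The output dimension matches the input regardless of how many experts exist, which is why adding experts grows capacity without growing per-token cost.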

Machine Translation · Translation

Language-Aware Multilingual Machine Translation with Self-Supervised Learning

1 code implementation • 10 Feb 2023 • Haoran Xu, Jean Maillard, Vedanuj Goswami

In this work, we first investigate how to utilize intra-distillation to learn more *language-specific* parameters and then show the importance of these language-specific parameters.

Cross-Lingual Transfer · Denoising · +3

Toxicity in Multilingual Machine Translation at Scale

no code implementations • 6 Oct 2022 • Marta R. Costa-jussà, Eric Smith, Christophe Ropers, Daniel Licht, Jean Maillard, Javier Ferrando, Carlos Escolano

We evaluate and analyze added toxicity when translating a large evaluation dataset (HOLISTICBIAS, over 472k sentences, covering 13 demographic axes) from English into 164 languages.

Hallucination · Machine Translation · +1

Text normalization for low-resource languages: the case of Ligurian

1 code implementation • 16 Jun 2022 • Stefano Lusito, Edoardo Ferrante, Jean Maillard

Text normalization is a crucial technology for low-resource languages that lack rigid spelling conventions or have undergone multiple spelling reforms.

Conversational Semantic Parsing

no code implementations • EMNLP 2020 • Armen Aghajanyan, Jean Maillard, Akshat Shrivastava, Keith Diedrick, Mike Haeger, Haoran Li, Yashar Mehdad, Ves Stoyanov, Anuj Kumar, Mike Lewis, Sonal Gupta

In this paper, we propose a semantic representation for such task-oriented conversational systems that can represent concepts such as co-reference and context carryover, enabling comprehensive understanding of queries in a session.

Dialog State Tracking · Semantic Parsing

Decoding Brain Activity Associated with Literal and Metaphoric Sentence Comprehension Using Distributional Semantic Models

no code implementations • TACL 2020 • Vesna G. Djokic, Jean Maillard, Luana Bulat, Ekaterina Shutova

We evaluate a range of semantic models (word embeddings, compositional, and visual models) in their ability to decode brain activity associated with reading of both literal and metaphoric sentences.

Sentence · Word Embeddings

Modeling Affirmative and Negated Action Processing in the Brain with Lexical and Compositional Semantic Models

no code implementations • ACL 2019 • Vesna Djokic, Jean Maillard, Luana Bulat, Ekaterina Shutova

Recent work shows that distributional semantic models can be used to decode patterns of brain activity associated with individual words and sentence meanings.

Negation · Semantic Composition · +1

Latent Tree Learning with Differentiable Parsers: Shift-Reduce Parsing and Chart Parsing

no code implementations • WS 2018 • Jean Maillard, Stephen Clark

Latent tree learning models represent sentences by composing their words according to an induced parse tree, with the tree learned solely from a downstream task.
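The composition step can be sketched as a bottom-up fold over a binary tree (a toy illustration under assumptions: nested tuples stand in for the induced tree, and element-wise addition stands in for the learned composition function):

```python
def compose(tree, embed, combine):
    """Bottom-up composition over a binary parse tree: leaves look up
    word embeddings, internal nodes combine their two children's vectors.
    In latent tree learning, the tree itself is induced by the model."""
    if isinstance(tree, str):
        return embed[tree]
    left, right = tree
    return combine(compose(left, embed, combine),
                   compose(right, embed, combine))

# toy embeddings and a trivial composition function (assumed for the demo)
embed = {"the": [1.0, 0.0], "cat": [0.0, 1.0], "sleeps": [1.0, 1.0]}
add = lambda a, b: [x + y for x, y in zip(a, b)]

# one candidate tree the model might induce: ((the cat) sleeps)
print(compose((("the", "cat"), "sleeps"), embed, add))  # [2.0, 2.0]
```

Different induced trees yield different intermediate vectors, which is what lets the downstream task's loss drive which parse is preferred.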
