1 code implementation • GeBNLP (COLING) 2020 • Danielle Saunders, Rosie Sallis, Bill Byrne
Neural Machine Translation (NMT) has been shown to struggle with grammatical gender that is dependent on the gender of human referents, which can cause gender bias effects.
no code implementations • 7 Jun 2023 • Danielle Saunders, Katrina Olsen
The vast majority of work on gender in MT focuses on 'unambiguous' inputs, where gender markers in the source language are expected to be resolved in the output.
1 code implementation • Findings (ACL) 2022 • Danielle Saunders, Rosie Sallis, Bill Byrne
Neural machine translation inference procedures like beam search generate the most likely output under the model.
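The paper above concerns beam search, the standard NMT inference procedure. A minimal sketch of the idea, with a hypothetical `step_logprobs` interface standing in for a trained model's next-token distribution:

```python
import heapq

def beam_search(step_logprobs, beam_size=3, max_len=10, eos=0):
    """Toy beam search. step_logprobs(prefix) returns a dict mapping
    next token -> log-probability given the prefix; this interface is
    an illustrative stand-in, not a real NMT toolkit API."""
    beams = [(0.0, [])]  # (cumulative log-prob, token sequence)
    for _ in range(max_len):
        candidates = []
        for score, seq in beams:
            if seq and seq[-1] == eos:  # finished hypotheses carry over
                candidates.append((score, seq))
                continue
            for tok, lp in step_logprobs(seq).items():
                candidates.append((score + lp, seq + [tok]))
        # Keep only the beam_size highest-scoring partial hypotheses.
        beams = heapq.nlargest(beam_size, candidates, key=lambda c: c[0])
        if all(seq and seq[-1] == eos for _, seq in beams):
            break
    return beams[0]  # the (approximately) most likely output under the model
```

Because the beam keeps only the highest-probability continuations, the search returns the mode of the model's distribution — which, as the paper notes, is exactly what amplifies any bias the model has learned.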
no code implementations • 14 Apr 2021 • Danielle Saunders
The development of deep learning techniques has allowed Neural Machine Translation (NMT) models to become extremely powerful, given sufficient training data and training time.
no code implementations • AACL (WAT) 2020 • Danielle Saunders, Weston Feely, Bill Byrne
One possible approach to this problem uses sub-character decomposition for training and test sentences.
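A minimal sketch of what sub-character decomposition looks like in practice. The decomposition table below is a toy, hand-picked example; real systems draw on ideograph component databases:

```python
# Hypothetical decomposition table for illustration only; real pipelines
# use radical/component dictionaries for logographic scripts.
DECOMP = {
    "好": ["女", "子"],  # components of 'good': woman + child
    "明": ["日", "月"],  # components of 'bright': sun + moon
}

def decompose(sentence):
    """Replace each known character with its sub-character components,
    so an unseen character can share components with seen ones.
    Characters without an entry are passed through unchanged."""
    out = []
    for ch in sentence:
        out.extend(DECOMP.get(ch, [ch]))
    return out
```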
no code implementations • WMT (EMNLP) 2020 • Danielle Saunders, Bill Byrne
The 2020 WMT Biomedical translation task evaluated Medline abstract translations.
1 code implementation • 11 Oct 2020 • Danielle Saunders, Rosie Sallis, Bill Byrne
Neural Machine Translation (NMT) has been shown to struggle with grammatical gender that is dependent on the gender of human referents, which can cause gender bias effects.
no code implementations • ACL 2020 • Danielle Saunders, Felix Stahlberg, Bill Byrne
We find that each of these lines of research leaves clear room for the other, and propose merging them with a scheme that allows a document-level evaluation metric to be used in the NMT training objective.
2 code implementations • ACL 2020 • Danielle Saunders, Bill Byrne
During inference we propose a lattice-rescoring scheme which outperforms all systems evaluated in Stanovsky et al. (2019) on WinoMT with no degradation of general test set BLEU, and we show this scheme can be applied to remove gender bias in the output of 'black box' online commercial MT systems.
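A simplified sketch of the rescoring idea: enumerate gender-inflected variants of a hypothesis and keep the one preferred by a scoring model. The variant table and scorer below are illustrative stand-ins for the paper's lattice construction and gender-adapted model:

```python
from itertools import product

def gender_variants(tokens, alternatives):
    """Enumerate hypotheses from a flattened lattice: each token may be
    swapped for a gender-inflected alternative. `alternatives` maps a
    token to its variant set (hypothetical example data)."""
    options = [alternatives.get(t, [t]) for t in tokens]
    return [list(v) for v in product(*options)]

def rescore(variants, scorer):
    """Keep the variant preferred by the scoring model; in the paper this
    would be a model adapted on gender-balanced data."""
    return max(variants, key=scorer)
```

Constraining the search to inflected variants of the original output is what preserves general translation quality while correcting gender inflection.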
no code implementations • WS 2019 • Felix Stahlberg, Danielle Saunders, Adrià de Gispert, Bill Byrne
Two techniques provide the fabric of the Cambridge University Engineering Department's (CUED) entry to the WMT19 evaluation campaign: elastic weight consolidation (EWC) and different forms of language modelling (LMs).
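EWC discourages fine-tuning from moving parameters that were important for the previously learned domain, weighting each parameter's movement by its Fisher information. A minimal sketch of the penalty term, with plain Python lists standing in for parameter tensors:

```python
def ewc_penalty(params, old_params, fisher, lam=0.1):
    """Elastic weight consolidation regulariser: quadratic penalty on
    parameter movement, scaled per-parameter by (estimated) Fisher
    information. Added to the new-domain task loss during fine-tuning."""
    return 0.5 * lam * sum(
        f * (p - q) ** 2 for f, p, q in zip(fisher, params, old_params)
    )
```

Parameters with zero Fisher information may move freely; parameters the old domain relied on are pulled back toward their previous values.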
no code implementations • WS 2019 • Danielle Saunders, Felix Stahlberg, Bill Byrne
The 2019 WMT Biomedical translation task involved translating Medline abstracts.
no code implementations • 11 Jun 2019 • Felix Stahlberg, Danielle Saunders, Adria de Gispert, Bill Byrne
Two techniques provide the fabric of the Cambridge University Engineering Department's (CUED) entry to the WMT19 evaluation campaign: elastic weight consolidation (EWC) and different forms of language modelling (LMs).
no code implementations • ACL 2019 • Danielle Saunders, Felix Stahlberg, Adria de Gispert, Bill Byrne
We investigate adaptive ensemble weighting for Neural Machine Translation, addressing the case of improving performance on a new and potentially unknown domain without sacrificing performance on the original domain.
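A minimal sketch of ensemble weighting for the setting above: per-model next-token distributions are linearly interpolated under a weight vector, and an adaptive scheme would set those weights per input rather than fixing them. The dict-based interface is an illustrative stand-in:

```python
import math

def ensemble_logprob(model_logprobs, weights):
    """Weighted linear mixture of per-model next-token distributions.
    model_logprobs: list of dicts mapping token -> log-probability;
    weights: mixture weights summing to 1. A token missing from a
    model's dict is treated as having probability zero."""
    tokens = set().union(*model_logprobs)
    out = {}
    for t in tokens:
        p = sum(
            w * math.exp(lp.get(t, float("-inf")))
            for w, lp in zip(weights, model_logprobs)
        )
        out[t] = math.log(p) if p > 0 else float("-inf")
    return out
```

Raising the weight of an in-domain model shifts the mixture toward its predictions without discarding the general-domain model entirely.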
1 code implementation • WS 2018 • Felix Stahlberg, Danielle Saunders, Bill Byrne
We propose to achieve explainable neural machine translation (NMT) by changing the output representation to explain itself.
no code implementations • ACL 2018 • Danielle Saunders, Felix Stahlberg, Adria de Gispert, Bill Byrne
We explore strategies for incorporating target syntax into Neural Machine Translation.
no code implementations • WS 2018 • Felix Stahlberg, Danielle Saunders, Gonzalo Iglesias, Bill Byrne
SGNMT is a decoding platform for machine translation which allows pairing various modern neural translation models with different kinds of constraints and symbolic models.
1 code implementation • EMNLP 2017 • Felix Stahlberg, Eva Hasler, Danielle Saunders, Bill Byrne
This paper introduces SGNMT, our experimental platform for machine translation research.