no code implementations • EMNLP 2021 • Eva Hasler, Tobias Domhan, Jonay Trenous, Ke Tran, Bill Byrne, Felix Hieber
Building neural machine translation systems to perform well on a specific target domain is a well-studied problem.
no code implementations • 17 Apr 2024 • Dawei Zhu, Sony Trenous, Xiaoyu Shen, Dietrich Klakow, Bill Byrne, Eva Hasler
Recent research has shown that large language models (LLMs) can achieve remarkable translation performance through supervised fine-tuning (SFT) using only a small amount of parallel data.
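As a rough illustration of this kind of SFT setup (not the paper's actual recipe), a parallel sentence pair can be turned into a causal-LM training example in which only the target tokens contribute to the loss; the prompt template, the Hugging Face-style tokenizer interface, and the -100 label masking below are assumptions following common practice:

```python
def make_sft_example(src, tgt, tokenizer, max_len=512):
    """Format one parallel pair as an SFT example for a causal LLM.

    Hypothetical sketch: the prompt template is illustrative, and the
    tokenizer is assumed to expose encode() and eos_token_id as in
    Hugging Face tokenizers.
    """
    prompt = f"Translate English to German:\n{src}\n"
    prompt_ids = tokenizer.encode(prompt)
    target_ids = tokenizer.encode(tgt) + [tokenizer.eos_token_id]
    input_ids = (prompt_ids + target_ids)[:max_len]
    # Mask prompt positions with -100 so cross-entropy (PyTorch
    # convention) is computed only on the target tokens.
    labels = ([-100] * len(prompt_ids) + target_ids)[:max_len]
    return {"input_ids": input_ids, "labels": labels}
```

With only a few thousand such examples, this format is what "a small amount of parallel data" typically amounts to in SFT pipelines.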
1 code implementation • 1 Dec 2023 • Jannis Vamvas, Tobias Domhan, Sony Trenous, Rico Sennrich, Eva Hasler
Neural metrics trained on human evaluations of MT tend to correlate well with human judgments, but their behavior is not fully understood.
no code implementations • 24 Oct 2022 • Tsz Kin Lam, Eva Hasler, Felix Hieber
Customer feedback can be an important signal for improving commercial machine translation systems.
1 code implementation • 10 Oct 2022 • Christos Baziotis, Prashant Mathur, Eva Hasler
A major open problem in neural machine translation (NMT) is the translation of idiomatic expressions, such as "under the weather".
1 code implementation • NAACL 2022 • Tobias Domhan, Eva Hasler, Ke Tran, Sony Trenous, Bill Byrne, Felix Hieber
Vocabulary selection, or lexical shortlisting, is a well-known technique for reducing the latency of neural machine translation models by constraining the set of allowed output words during inference.
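As a minimal sketch of the idea (assuming a PyTorch decoder that produces per-step logits; the function and its arguments are illustrative, not the paper's implementation), shortlisting amounts to masking all vocabulary entries outside the allowed set before the softmax:

```python
import torch

def shortlist_logits(logits, shortlist_ids):
    """Restrict next-token logits to a lexical shortlist.

    logits: (batch, vocab) decoder scores for the next position.
    shortlist_ids: vocabulary indices allowed for this batch, e.g.
    derived from the source sentence via a translation lexicon
    (a hypothetical source of the shortlist).
    """
    mask = torch.full_like(logits, float("-inf"))
    mask[:, shortlist_ids] = 0.0  # allowed tokens keep their scores
    return logits + mask          # all others become -inf
```

Because the softmax and top-k search then run over a much smaller effective vocabulary, decoding latency drops.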
no code implementations • WS 2019 • Weston Feely, Eva Hasler, Adrià de Gispert
In the Japanese language, different levels of honorific speech are used to convey respect, deference, humility, formality and social distance.
no code implementations • NAACL 2018 • Eva Hasler, Adrià de Gispert, Gonzalo Iglesias, Bill Byrne
Despite the impressive quality improvements yielded by neural machine translation (NMT) systems, controlling their translation output to adhere to user-provided terminology constraints remains an open problem.
no code implementations • NAACL 2018 • Gonzalo Iglesias, William Tambellini, Adrià de Gispert, Eva Hasler, Bill Byrne
We describe a batched beam decoding algorithm for NMT with LMBR n-gram posteriors, showing that LMBR techniques still yield gains on top of the best recently reported results with Transformers.
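A heavily simplified, hypothetical sketch of the scoring idea (not the paper's batched algorithm): LMBR-style gains sum posterior scores of the n-grams a candidate extension would complete, and this bonus is added to the NMT log-probability of each beam candidate. The data structures and the interpolation weight below are illustrative assumptions:

```python
def rescore_step(nmt_logprobs, prefix, ngram_posteriors, weight=0.5, n=4):
    """Score candidate next tokens with an additive n-gram posterior bonus.

    nmt_logprobs: dict token -> NMT log-prob at the next position.
    prefix: list of tokens of the current hypothesis.
    ngram_posteriors: dict mapping n-gram tuples to posterior scores,
    assumed to be estimated from an SMT lattice beforehand.
    """
    scores = {}
    for tok, lp in nmt_logprobs.items():
        ext = prefix + [tok]
        # Sum posterior gains for all n-grams ending in the new token,
        # in the spirit of LMBR gain functions.
        bonus = sum(
            ngram_posteriors.get(tuple(ext[-k:]), 0.0)
            for k in range(1, n + 1)
        )
        scores[tok] = lp + weight * bonus
    return scores
```

In the actual system this computation is batched across beams and sentences for deployment-level throughput.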
1 code implementation • WS 2017 • Eva Hasler, Felix Stahlberg, Marcus Tomalin, Adrià de Gispert, Bill Byrne
We compare several language models for the word-ordering task and propose a new bag-to-sequence neural model based on attention-based sequence-to-sequence models.
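For intuition about the word-ordering task itself (this is a toy language-model baseline, not the proposed bag-to-sequence model), one can greedily build an ordering by always appending the remaining word that the language model scores highest as a continuation; the score_next interface is a hypothetical stand-in for any conditional LM:

```python
def order_bag_greedy(bag, score_next):
    """Greedily order a bag of words under a language model.

    bag: list of words to arrange (a multiset).
    score_next(prefix, word) -> float: assumed interface returning the
    LM's conditional log-probability of `word` given `prefix`.
    """
    prefix, remaining = [], list(bag)
    while remaining:
        best = max(remaining, key=lambda w: score_next(prefix, w))
        prefix.append(best)
        remaining.remove(best)
    return prefix
```

Beam search over the same interface, or a trained bag-to-sequence model with attention over the input bag, typically recovers better orderings than this greedy sketch.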
1 code implementation • EMNLP 2017 • Felix Stahlberg, Eva Hasler, Danielle Saunders, Bill Byrne
This paper introduces SGNMT, our experimental platform for machine translation research.
no code implementations • EACL 2017 • Felix Stahlberg, Adrià de Gispert, Eva Hasler, Bill Byrne
Our approach is much more flexible than $n$-best list or lattice rescoring, as the neural decoder is not restricted to the SMT search space.
no code implementations • WS 2016 • Felix Stahlberg, Eva Hasler, Bill Byrne
This paper presents the University of Cambridge submission to WMT16.
no code implementations • ACL 2016 • Felix Stahlberg, Eva Hasler, Aurelien Waite, Bill Byrne
We investigate the use of hierarchical phrase-based SMT lattices in end-to-end neural machine translation (NMT).
1 code implementation • 15 Oct 2015 • Desmond Elliott, Stella Frank, Eva Hasler
In this paper we present an approach to multi-language image description bringing together insights from neural machine translation and neural image description.