1 code implementation • WMT (EMNLP) 2021 • Pinzhen Chen, Jindřich Helcl, Ulrich Germann, Laurie Burchell, Nikolay Bogoychev, Antonio Valerio Miceli Barone, Jonas Waldendorf, Alexandra Birch, Kenneth Heafield
This paper presents the University of Edinburgh’s constrained submissions of English-German and English-Hausa systems to the WMT 2021 shared task on news translation.
no code implementations • IWSLT 2016 • Ondřej Bojar, Ondřej Cífka, Jindřich Helcl, Tom Kocmi, Roman Sudarikov
We present our submissions to the IWSLT 2016 machine translation task, as our first attempt to translate subtitles and one of our early experiments with neural machine translation (NMT).
no code implementations • NAACL 2022 • Jindřich Helcl, Barry Haddow, Alexandra Birch
In this paper, we point out flaws in the evaluation methodology present in the literature on NAR models and we provide a fair comparison between a state-of-the-art NAR model and the autoregressive submissions to the shared task.
no code implementations • 10 Apr 2024 • Martin Popel, Lucie Poláková, Michal Novák, Jindřich Helcl, Jindřich Libovický, Pavel Straňák, Tomáš Krabač, Jaroslava Hlaváčová, Mariia Anisimova, Tereza Chlaňová
We present Charles Translator, a machine translation system between Ukrainian and Czech, developed as part of a society-wide effort to mitigate the impact of the Russian-Ukrainian war on individuals and society.
2 code implementations • 24 Nov 2023 • Nikolay Bogoychev, Jelmer Van der Linde, Graeme Nail, Barry Haddow, Jaume Zaragoza-Bernabeu, Gema Ramírez-Sánchez, Lukas Weymann, Tudor Nicolae Mateiu, Jindřich Helcl, Mikko Aulamo
Developing high-quality machine translation systems is a labour-intensive, challenging, and confusing process for newcomers to the field.
no code implementations • 25 Oct 2023 • Jindřich Helcl, Jindřich Libovický
The goal of the shared task was to develop systems for named entity recognition and question answering in several under-represented languages.
no code implementations • 1 Dec 2022 • Martin Popel, Jindřich Libovický, Jindřich Helcl
We present Charles University submissions to the WMT22 General Translation Shared Task on Czech-Ukrainian and Ukrainian-Czech machine translation.
no code implementations • 1 Dec 2022 • Jindřich Helcl
We present a non-autoregressive system submission to the WMT 22 Efficient Translation Shared Task.
no code implementations • CL (ACL) 2022 • Barry Haddow, Rachel Bawden, Antonio Valerio Miceli Barone, Jindřich Helcl, Alexandra Birch
We present a survey covering the state of the art in low-resource machine translation research.
no code implementations • 7 Apr 2020 • Zdeněk Kasner, Jindřich Libovický, Jindřich Helcl
Non-autoregressive (NAR) models for machine translation (MT) manifest superior decoding speed when compared to autoregressive (AR) models, at the expense of impaired fluency of their outputs.
no code implementations • WS 2019 • Jindřich Helcl, Jindřich Libovický, Martin Popel
We present our submission to the WMT19 Robustness Task.
no code implementations • 12 Nov 2018 • Jindřich Libovický, Jindřich Helcl, David Mareček
In multi-source sequence-to-sequence tasks, the attention mechanism can be modeled in several ways.
1 code implementation • 12 Nov 2018 • Jindřich Libovický, Jindřich Helcl
Autoregressive decoding is the only part of sequence-to-sequence models that prevents them from being massively parallelized at inference time.
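The point of that entry can be illustrated with a toy sketch (all function names here are hypothetical stand-ins, not the paper's CTC-based model): autoregressive decoding must run a loop in which each step waits for the previous output, whereas a non-autoregressive decoder predicts every position independently and can therefore emit the whole sequence in one parallel forward pass.

```python
# Toy contrast between autoregressive (AR) and non-autoregressive (NAR)
# decoding. Purely illustrative: in a real NMT system, `next_token` and
# `all_tokens` would be neural-network forward passes.

def next_token(prefix):
    """AR step: the next token depends on everything emitted so far."""
    return len(prefix)  # stand-in for argmax over a softmax distribution

def decode_ar(length):
    """Sequential: step t cannot start before step t-1 has finished."""
    out = []
    for _ in range(length):
        out.append(next_token(out))
    return out

def all_tokens(length):
    """NAR step: each position is predicted independently of the others,
    so all positions can be computed in a single parallel pass."""
    return [t for t in range(length)]

def decode_nar(length):
    return all_tokens(length)

print(decode_ar(5))   # [0, 1, 2, 3, 4]
print(decode_nar(5))  # [0, 1, 2, 3, 4]
```

In this toy, both decoders produce the same sequence; the difference is purely in the dependency structure, which is what limits AR models to one token per step while letting NAR models exploit hardware parallelism.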
no code implementations • 12 Nov 2018 • Jindřich Helcl, Jindřich Libovický, Dušan Variš
For our submission, we acquired additional textual and multimodal data.
3 code implementations • WS 2017 • Antonio Valerio Miceli Barone, Jindřich Helcl, Rico Sennrich, Barry Haddow, Alexandra Birch
It has been shown that increasing model depth improves the quality of neural machine translation.
no code implementations • 14 Jul 2017 • Jindřich Helcl, Jindřich Libovický
For Task 1 (multimodal translation), our best scoring system is a purely textual neural translation of the source image caption to the target language.
1 code implementation • 21 Apr 2017 • Jindřich Libovický, Jindřich Helcl
Modeling attention in neural multi-source sequence-to-sequence learning remains a relatively unexplored area, despite its usefulness in tasks that incorporate multiple source languages or modalities.
no code implementations • WS 2016 • Jindřich Libovický, Jindřich Helcl, Marek Tlustý, Pavel Pecina, Ondřej Bojar
Neural sequence-to-sequence learning has recently become a very promising paradigm in machine translation, achieving results competitive with statistical phrase-based systems.