Search Results for author: Jan Rosendahl

Found 14 papers, 1 paper with code

Analysis of Positional Encodings for Neural Machine Translation

no code implementations EMNLP (IWSLT) 2019 Jan Rosendahl, Viet Anh Khoa Tran, Weiyue Wang, Hermann Ney

In this work we analyze and compare the behavior of the Transformer architecture when using different positional encoding methods.

Tasks: Machine Translation, Sentence, +1
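For reference, below is a minimal NumPy sketch of the sinusoidal positional encoding of Vaswani et al. (2017), one of the standard methods such a comparison covers; it is an illustration, not the paper's experimental setup:

```python
import numpy as np

def sinusoidal_positional_encoding(max_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encoding (Vaswani et al., 2017), d_model even.

    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    """
    positions = np.arange(max_len)[:, None]          # (max_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]         # (1, d_model // 2)
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

# Typically added to the scaled token embeddings before the first layer:
#   x = embed(tokens) * sqrt(d_model) + pe[:seq_len]
```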

The RWTH Aachen Machine Translation Systems for IWSLT 2017

no code implementations IWSLT 2017 Parnia Bahar, Jan Rosendahl, Nick Rossenbach, Hermann Ney

This work describes the Neural Machine Translation (NMT) system of the RWTH Aachen University developed for the English–German tracks of the evaluation campaign of the International Workshop on Spoken Language Translation (IWSLT) 2017.

Tasks: Domain Adaptation, Machine Translation, +2

Detecting Various Types of Noise for Neural Machine Translation

no code implementations Findings (ACL) 2022 Christian Herold, Jan Rosendahl, Joris Vanvinckenroye, Hermann Ney

The filtering and/or selection of training data is one of the core aspects to be considered when building a strong machine translation system. In their influential work, Khayrallah and Koehn (2018) investigated the impact of different types of noise on the performance of machine translation systems. In the same year, the WMT introduced a shared task on parallel corpus filtering, which went on to be repeated in the following years and resulted in many different filtering approaches being proposed. In this work we aim to combine the recent achievements in data filtering with the original analysis of Khayrallah and Koehn (2018) and investigate whether state-of-the-art filtering systems are capable of removing all the suggested noise types. We observe that most of these types of noise can be detected with an accuracy of over 90% by modern filtering systems when operating in a well-studied high-resource setting. However, we also find that when confronted with more refined noise categories or when working with a less common language pair, the performance of the filtering systems is far from optimal, showing that there is still room for improvement in this area of research.

Tasks: Machine Translation, Translation
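To make the noise taxonomy concrete, here is an illustrative sketch of two simple heuristics targeting noise types from Khayrallah and Koehn (2018) (misaligned or empty segments, and untranslated copies); the thresholds and function names are assumptions, not the filtering systems evaluated in the paper:

```python
def length_ratio_ok(src: str, tgt: str, max_ratio: float = 3.0) -> bool:
    """Catch 'misaligned sentences' and empty-side noise via length ratio."""
    src_len, tgt_len = len(src.split()), len(tgt.split())
    if min(src_len, tgt_len) == 0:
        return False  # one side is empty: clear noise
    return max(src_len, tgt_len) / min(src_len, tgt_len) <= max_ratio

def not_a_copy(src: str, tgt: str) -> bool:
    """Catch 'untranslated (copied source)' noise: target equals source."""
    return src.strip().lower() != tgt.strip().lower()

def keep_pair(src: str, tgt: str) -> bool:
    """Keep a sentence pair only if no heuristic flags it as noisy."""
    return length_ratio_ok(src, tgt) and not_a_copy(src, tgt)
```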

Recurrent Attention for the Transformer

no code implementations EMNLP (insights) 2021 Jan Rosendahl, Christian Herold, Frithjof Petrick, Hermann Ney

In this work, we conduct a comprehensive investigation on one of the centerpieces of modern machine translation systems: the encoder-decoder attention mechanism.

Tasks: Machine Translation, Translation
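As background, a single-head NumPy sketch of the scaled dot-product encoder-decoder attention the paper investigates; the learned Q/K/V projections of a real Transformer are omitted to keep it minimal, so names and shapes are illustrative:

```python
import numpy as np

def cross_attention(decoder_states: np.ndarray, encoder_states: np.ndarray) -> np.ndarray:
    """Single-head scaled dot-product encoder-decoder attention.

    decoder_states: (tgt_len, d_model) -- queries come from the decoder
    encoder_states: (src_len, d_model) -- keys and values come from the encoder
    """
    q, k, v = decoder_states, encoder_states, encoder_states
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                   # (tgt_len, src_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over source positions
    return weights @ v                                # (tgt_len, d_model)
```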

Efficient Sequence Training of Attention Models using Approximative Recombination

no code implementations 18 Oct 2021 Nils-Philipp Wynands, Wilfried Michel, Jan Rosendahl, Ralf Schlüter, Hermann Ney

Lastly, it is shown that this technique can be used to effectively perform sequence discriminative training for attention-based encoder-decoder acoustic models on the LibriSpeech task.

Tasks: Automatic Speech Recognition (ASR), +1
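To illustrate the general idea of recombination in beam search (merging partial hypotheses treated as equivalent so that beam slots are freed for genuinely different candidates), here is a rough Python sketch; the equivalence criterion used here (matching last few labels) is a stand-in assumption, not the paper's exact approximation:

```python
from dataclasses import dataclass

@dataclass
class Hyp:
    tokens: tuple[int, ...]  # label history so far
    logp: float              # accumulated log-probability

def recombine(beam: list[Hyp], context: int = 3) -> list[Hyp]:
    """Merge hypotheses whose last `context` labels agree, keeping the
    best-scoring one per group; the freed beam slots can then hold
    genuinely different hypotheses during decoding or sequence training."""
    best: dict[tuple[int, ...], Hyp] = {}
    for hyp in beam:
        key = hyp.tokens[-context:]
        if key not in best or hyp.logp > best[key].logp:
            best[key] = hyp
    return sorted(best.values(), key=lambda h: -h.logp)
```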

The RWTH Aachen University Machine Translation Systems for WMT 2019

no code implementations WS 2019 Jan Rosendahl, Christian Herold, Yunsu Kim, Miguel Graça, Weiyue Wang, Parnia Bahar, Yingbo Gao, Hermann Ney

For the De-En task, none of the tested methods gave a significant improvement over last year's winning system and we end up with the same performance, resulting in 39.6% BLEU on newstest2019.

Tasks: Attribute, Language Modelling, +3
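For context, scores like the 39.6 BLEU quoted above are corpus-level BLEU on newstest2019; a minimal sketch of computing such a score with the sacrebleu package (the file names are hypothetical, and this is not necessarily the paper's own evaluation pipeline):

```python
import sacrebleu  # pip install sacrebleu

# Hypothetical files: detokenized system outputs and the newstest2019
# reference, one segment per line.
hyps = open("newstest2019.de-en.hyp", encoding="utf-8").read().splitlines()
refs = [open("newstest2019.de-en.ref", encoding="utf-8").read().splitlines()]

bleu = sacrebleu.corpus_bleu(hyps, refs)  # refs: list of reference streams
print(f"{bleu.score:.1f}")                # corpus-level BLEU score
```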

The RWTH Aachen University Supervised Machine Translation Systems for WMT 2018

1 code implementation WS 2018 Julian Schamper, Jan Rosendahl, Parnia Bahar, Yunsu Kim, Arne Nix, Hermann Ney

In total we improve by 6.8% BLEU over our last year's submission and by 4.8% BLEU over the winning system of the 2017 German→English task.

Tasks: Machine Translation, Translation
