The University of Sydney's Machine Translation System for WMT19

WS 2019 · Liang Ding, Dacheng Tao

This paper describes the University of Sydney's submission to the WMT 2019 shared news translation task. We participated in the Finnish→English direction and achieved the best BLEU score (33.0) among all participants. Our system is based on the self-attentional Transformer network, into which we integrated the most recent effective strategies from academic research (e.g., BPE, back translation, multi-feature data selection, data augmentation, greedy model ensemble, reranking, ConMBR system combination, and post-processing). Furthermore, we propose a novel augmentation method, Cycle Translation, and a data mixture strategy, Big/Small parallel construction, to fully exploit the synthetic corpus. Extensive experiments show that adding the above techniques yields consistent BLEU improvements, and our best result outperforms the baseline (a Transformer ensemble trained on the original parallel corpus) by approximately 5.3 BLEU, achieving state-of-the-art performance.
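
The abstract only names these strategies without detailing them. As a rough illustration of one of them, below is a minimal Python sketch of greedy model-ensemble selection, assuming the generic recipe of growing an ensemble checkpoint by checkpoint while development-set BLEU keeps improving; the `score_ensemble` callable and the checkpoint names are hypothetical placeholders, not the submission's actual code.

```python
# Hedged sketch of greedy model-ensemble selection over Transformer checkpoints.
# `score_ensemble` stands in for real ensemble decoding of the dev set plus
# BLEU scoring; it is a placeholder, not the paper's implementation.
from typing import Callable, List, Sequence


def greedy_ensemble_selection(
    checkpoints: Sequence[str],
    score_ensemble: Callable[[List[str]], float],
) -> List[str]:
    """Greedily grow an ensemble while dev-set BLEU keeps improving."""
    remaining = list(checkpoints)
    # Seed the ensemble with the single best checkpoint.
    best = max(remaining, key=lambda c: score_ensemble([c]))
    ensemble, best_bleu = [best], score_ensemble([best])
    remaining.remove(best)

    while remaining:
        # Try adding each remaining checkpoint; keep the best addition.
        candidate = max(remaining, key=lambda c: score_ensemble(ensemble + [c]))
        candidate_bleu = score_ensemble(ensemble + [candidate])
        if candidate_bleu <= best_bleu:
            break  # no remaining checkpoint improves dev BLEU
        ensemble.append(candidate)
        best_bleu = candidate_bleu
        remaining.remove(candidate)

    return ensemble


if __name__ == "__main__":
    # Toy stand-in scores (hypothetical); real code would ensemble-decode the
    # dev set with an NMT toolkit and compute BLEU for each subset.
    toy_scores = {
        ("ckpt1",): 31.0, ("ckpt2",): 30.5, ("ckpt3",): 29.8,
        ("ckpt1", "ckpt2"): 31.6, ("ckpt1", "ckpt3"): 31.2,
        ("ckpt1", "ckpt2", "ckpt3"): 31.5,
    }

    def score(subset: List[str]) -> float:
        return toy_scores[tuple(sorted(subset))]

    print(greedy_ensemble_selection(["ckpt1", "ckpt2", "ckpt3"], score))
```

In practice the scoring callable would decode the development set with the given checkpoint subset (e.g., by averaging output distributions in an NMT toolkit) and return corpus BLEU.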

Task                  Dataset                    Model                 Metric   Value   Global Rank
Machine Translation   WMT2016 Finnish-English    CT+B/S construction   BLEU     32.4    #1
Machine Translation   WMT2017 Finnish-English    CT+B/S construction   BLEU     35.5    #1
Machine Translation   WMT2018 Finnish-English    CT+B/S construction   BLEU     26.5    #1
Machine Translation   WMT2019 Finnish-English    CT+B/S construction   BLEU     34.1    #1
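
For context on the BLEU numbers above: scores on WMT news test sets are conventionally computed with sacrebleu on detokenized output. The paper does not state its exact scoring setup, so the snippet below is only a sketch of that convention; the file names are hypothetical placeholders.

```python
# Minimal sketch: scoring detokenized system output against a WMT reference
# with sacrebleu's Python API. File names are hypothetical placeholders.
import sacrebleu

with open("newstest2019.fi-en.ref.en", encoding="utf-8") as f:
    refs = [line.rstrip("\n") for line in f]
with open("system_output.detok.en", encoding="utf-8") as f:
    hyps = [line.rstrip("\n") for line in f]

# corpus_bleu takes the hypothesis stream and a list of reference streams.
bleu = sacrebleu.corpus_bleu(hyps, [refs])
print(f"BLEU = {bleu.score:.1f}")
```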
