Machine translation is the task of automatically translating a sentence from a source language into a different target language.
Several mechanisms to focus attention of a neural network on selected parts of its input or memory have been used successfully in deep learning models in recent years.
#26 best model for Machine Translation on WMT2014 English-French
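The attention idea described above can be sketched in a few lines: a query vector scores every input position, the scores are normalized with a softmax, and the output is the resulting weighted sum of values. Below is a minimal numpy illustration assuming scaled dot-product attention; the names and shapes are ours for illustration, not taken from any specific paper.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(query, keys, values):
    """Soft attention: weight each input position by its relevance to the query.

    query:  (d,)       current decoder state
    keys:   (n, d)     one vector per input position
    values: (n, d_v)   content to be aggregated
    """
    scores = keys @ query / np.sqrt(query.shape[0])  # (n,) similarity per position
    weights = softmax(scores)                        # attention distribution over positions
    return weights @ values, weights                 # weighted sum of values

# Toy usage: 5 input positions, 8-dimensional representations.
rng = np.random.default_rng(0)
context, w = attention(rng.normal(size=8), rng.normal(size=(5, 8)), rng.normal(size=(5, 8)))
print(w.round(3), context.shape)
```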
Dictionaries and phrase tables are the basis of modern statistical machine translation systems.
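As a toy illustration of what a phrase table is, the sketch below stores scored target candidates per source phrase in a plain dict. The entries and probabilities are invented for illustration; real statistical systems estimate them from aligned parallel corpora.

```python
# Hypothetical toy phrase table: source phrase -> [(target phrase, translation probability)].
phrase_table = {
    "the house": [("la maison", 0.71), ("la demeure", 0.12)],
    "is small":  [("est petite", 0.64)],
}

def best_translation(source_phrase):
    """Return the highest-scoring target candidate, or None if unseen."""
    candidates = phrase_table.get(source_phrase, [])
    return max(candidates, key=lambda c: c[1], default=None)

print(best_translation("the house"))  # ('la maison', 0.71)
```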
We propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.
SOTA for CCG Supertagging on CCGBank
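A minimal numpy sketch of CVT's unsupervised signal, assuming a token-labeling setup: an auxiliary module that sees only a restricted view of the input (e.g. only one LSTM direction) is trained to match the predictions of the primary module, which sees the full input. The real algorithm uses several auxiliary views and trains end to end; this shows only the consistency loss.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(-1, keepdims=True))
    return e / e.sum(-1, keepdims=True)

def cvt_consistency_loss(full_view_logits, restricted_view_logits):
    """Cross-entropy between the full-view module's predictions (treated as
    fixed targets) and a restricted-view auxiliary module's predictions."""
    teacher = softmax(full_view_logits)                  # full-view predictions
    log_student = np.log(softmax(restricted_view_logits) + 1e-12)
    return -(teacher * log_student).sum(-1).mean()       # averaged over tokens

# Toy usage: 4 tokens, 3 labels; random logits stand in for encoder outputs.
rng = np.random.default_rng(1)
print(cvt_consistency_loss(rng.normal(size=(4, 3)), rng.normal(size=(4, 3))))
```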
Recent work has highlighted the strength of the Transformer architecture on sequence tasks, while neural architecture search (NAS) has begun to outperform human-designed models.
#2 best model for Machine Translation on WMT2014 English-German
Feed-forward and convolutional architectures have recently been shown to achieve superior results on some sequence modeling tasks such as machine translation, with the added advantage that they concurrently process all inputs in the sequence, leading to easy parallelization and faster training times.
#5 best model for Machine Translation on WMT2014 English-German
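The parallelism claim above can be made concrete: a convolutional layer computes every output position from the input alone, with no dependence of step t on step t-1, so all positions can be evaluated concurrently. A minimal numpy sketch with illustrative shapes:

```python
import numpy as np

def causal_conv1d(x, w):
    """One convolutional layer applied to the whole sequence in a single shot.
    x: (n, d) input sequence; w: (k, d, d_out) filter over k past positions."""
    k = w.shape[0]
    x_pad = np.concatenate([np.zeros((k - 1, x.shape[1])), x])       # left-pad to keep outputs causal
    windows = np.stack([x_pad[i:i + k] for i in range(x.shape[0])])  # (n, k, d): every position's window
    return np.einsum("nkd,kdo->no", windows, w)                      # all positions computed at once

rng = np.random.default_rng(2)
out = causal_conv1d(rng.normal(size=(6, 4)), rng.normal(size=(3, 4, 8)))
print(out.shape)  # (6, 8): all six positions produced concurrently
```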
On the WMT 2014 English-to-German and English-to-French translation tasks, relative position representations yield improvements of 1.3 BLEU and 0.3 BLEU over absolute position representations, respectively.
#3 best model for Machine Translation on WMT2014 English-German
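A minimal numpy sketch of relative position representations in self-attention, in the style of the paper above: each attention logit gets an extra term from an embedding of the (clipped) offset between the query and key positions. Shapes and names are illustrative, and only the key-side embeddings are shown.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(-1, keepdims=True))
    return e / e.sum(-1, keepdims=True)

def relative_attention(q, k, rel_emb, max_dist):
    """Attention weights with relative position embeddings.
    q, k:    (n, d) projected queries and keys
    rel_emb: (2*max_dist + 1, d) one embedding per clipped relative offset
    """
    n, d = q.shape
    offsets = np.arange(n)[None, :] - np.arange(n)[:, None]     # j - i for every query/key pair
    offsets = np.clip(offsets, -max_dist, max_dist) + max_dist  # map offsets into [0, 2*max_dist]
    a = rel_emb[offsets]                                        # (n, n, d) pairwise offset embeddings
    logits = (q @ k.T + np.einsum("id,ijd->ij", q, a)) / np.sqrt(d)
    return softmax(logits)

rng = np.random.default_rng(3)
w = relative_attention(rng.normal(size=(5, 8)), rng.normal(size=(5, 8)),
                       rng.normal(size=(5, 8)), max_dist=2)
print(w.shape)  # (5, 5) attention distribution per position
```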
We propose to improve the representation in sequence models by augmenting current approaches with an autoencoder that is forced to compress the sequence through an intermediate discrete latent space.
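One common way to impose such a discrete bottleneck is vector-quantization-style rounding: snap each continuous encoder state to its nearest entry in a learned codebook. The sketch below shows only that rounding step with a random codebook; it is one illustrative discretization scheme, not necessarily the one the paper settles on.

```python
import numpy as np

def quantize(z, codebook):
    """Force continuous states through a discrete latent space by mapping each
    vector to its nearest codebook entry (VQ-style nearest-neighbor rounding).
    z: (n, d) encoder outputs; codebook: (K, d) discrete codes (learned in practice)."""
    d2 = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)  # (n, K) squared distances
    codes = d2.argmin(-1)                                       # discrete latent symbols
    return codes, codebook[codes]                               # symbol ids and quantized vectors

rng = np.random.default_rng(4)
codes, zq = quantize(rng.normal(size=(6, 4)), rng.normal(size=(16, 4)))
print(codes, zq.shape)  # six discrete code ids, quantized states of shape (6, 4)
```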