Search Results for author: Tamer Alkhouli

Found 17 papers, 1 paper with code

Neural Simultaneous Speech Translation Using Alignment-Based Chunking

no code implementations · WS 2020 · Patrick Wilken, Tamer Alkhouli, Evgeny Matusov, Pavel Golik

In simultaneous machine translation, the objective is to determine when to produce a partial translation given a continuous stream of source words, with a trade-off between latency and quality.

Tasks: Chunking, Machine Translation, +3
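The latency/quality trade-off in simultaneous translation is often illustrated with a fixed read/write schedule. Below is a minimal sketch of a generic wait-k policy (a standard baseline, not the paper's alignment-based chunking): read k source tokens before writing the first target token, then alternate reads and writes.

```python
def wait_k_schedule(src_len, tgt_len, k):
    """Return the sequence of actions for a wait-k policy:
    'R' = read one source token, 'W' = write one target token."""
    actions = []
    read, written = 0, 0
    while written < tgt_len:
        # Keep reading until we are k tokens ahead of the output
        # (or the source stream is exhausted), then write.
        if read < min(written + k, src_len):
            actions.append('R')
            read += 1
        else:
            actions.append('W')
            written += 1
    return actions

# With k=2, the decoder lags two source tokens behind:
# wait_k_schedule(4, 4, 2) -> ['R', 'R', 'W', 'R', 'W', 'R', 'W', 'W']
```

Larger k means higher latency but more source context per target word, which is the trade-off the paper's chunking approach addresses adaptively rather than with a fixed schedule.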

On The Alignment Problem In Multi-Head Attention-Based Neural Machine Translation

no code implementations · WS 2018 · Tamer Alkhouli, Gabriel Bretschner, Hermann Ney

This work investigates the alignment problem in state-of-the-art multi-head attention models based on the transformer architecture.

Tasks: Machine Translation, Translation
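A common heuristic for extracting word alignments from attention models — the kind of procedure whose reliability this line of work examines — is to average the attention weights over heads and take the argmax source position for each target position. A minimal NumPy sketch of that heuristic (not the paper's exact procedure):

```python
import numpy as np

def alignment_from_attention(head_weights):
    """Extract a hard alignment from multi-head attention weights.

    head_weights: array of shape (num_heads, tgt_len, src_len),
    each row a distribution over source positions.
    Returns one aligned source index per target position.
    """
    avg = head_weights.mean(axis=0)   # merge heads: (tgt_len, src_len)
    return avg.argmax(axis=-1)        # most-attended source position
```

With multi-head transformers, individual heads often attend to very different positions, which is why a naive average like this can disagree with true word alignments.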

Neural Hidden Markov Model for Machine Translation

no code implementations · ACL 2018 · Weiyue Wang, Derui Zhu, Tamer Alkhouli, Zixuan Gan, Hermann Ney

Attention-based neural machine translation (NMT) models selectively focus on specific source positions to produce a translation, which brings significant improvements over pure encoder-decoder sequence-to-sequence models.

Tasks: Machine Translation, NMT, +1
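The selective focus described in the snippet is the standard attention mechanism: score each source state against the current decoder state, normalize with a softmax, and form a weighted sum. A textbook dot-product sketch (illustrative only; the paper instead replaces attention with a neural hidden Markov alignment model):

```python
import numpy as np

def attention_context(query, keys, values):
    """Single-query dot-product attention.

    query:  decoder state, shape (d,)
    keys:   source states, shape (src_len, d)
    values: source states to combine, shape (src_len, d_v)
    Returns the context vector and the attention weights.
    """
    scores = keys @ query                    # (src_len,)
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ values, weights
```

The contrast drawn in the paper is that attention weights like these are deterministic and normalized per target step, whereas an HMM treats the aligned source position as a latent variable summed out during training.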

RETURNN as a Generic Flexible Neural Toolkit with Application to Translation and Speech Recognition

3 code implementations · ACL 2018 · Albert Zeyer, Tamer Alkhouli, Hermann Ney

We demonstrate the fast training and decoding speed of RETURNN attention models for translation, owing to fast CUDA LSTM kernels and a fast pure TensorFlow beam search decoder.

Tasks: Speech Recognition, +1
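The beam search mentioned in the snippet keeps the highest-scoring partial hypotheses at each step and expands each of them. A minimal generic sketch (a plain Python illustration of the algorithm, not RETURNN's TensorFlow decoder; `step_fn` is a hypothetical callback returning log-probabilities for the next token):

```python
def beam_search(step_fn, bos, eos, beam_size, max_len):
    """Minimal beam search over token sequences.

    step_fn(prefix) -> {token: logprob} scores possible next tokens.
    Keeps the beam_size best unfinished hypotheses per step; a
    hypothesis is finished once it emits the eos token.
    """
    beams = [([bos], 0.0)]
    finished = []
    for _ in range(max_len):
        candidates = []
        for prefix, score in beams:
            for tok, logprob in step_fn(prefix).items():
                candidates.append((prefix + [tok], score + logprob))
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for prefix, score in candidates:
            if prefix[-1] == eos:
                finished.append((prefix, score))
            else:
                beams.append((prefix, score))
            if len(beams) == beam_size:
                break  # beam is full; drop lower-scoring candidates
        if not beams:
            break  # all surviving hypotheses have finished
    finished.extend(beams)
    return max(finished, key=lambda c: c[1])[0]
```

Production decoders batch the expansion step on the GPU and add refinements such as length normalization, but the pruning logic is the same.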

Hybrid Neural Network Alignment and Lexicon Model in Direct HMM for Statistical Machine Translation

no code implementations · ACL 2017 · Weiyue Wang, Tamer Alkhouli, Derui Zhu, Hermann Ney

Recently, neural machine translation systems have shown promising performance and have surpassed phrase-based systems on most translation tasks.

Tasks: Machine Translation, Translation, +1
