Search Results for author: Niyu Ge

Found 4 papers, 0 papers with code

Long-Short Term Masking Transformer: A Simple but Effective Baseline for Document-level Neural Machine Translation

no code implementations · EMNLP 2020 · Pei Zhang, Boxing Chen, Niyu Ge, Kai Fan

In this paper, we extensively study the pros and cons of the standard transformer in document-level translation, and find that its auto-regressive property simultaneously brings both the advantage of consistency and the disadvantage of error accumulation.

Tasks: Machine Translation · NMT +1
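The auto-regressive property mentioned in the abstract comes from causal masking in the transformer decoder: each position may attend only to earlier positions, so every generated token conditions on the tokens produced before it, which propagates both context (consistency) and mistakes (error accumulation). Below is a minimal PyTorch sketch of a generic causal mask for illustration; it is not the paper's long-short term masking scheme.

```python
import torch

def causal_mask(seq_len: int) -> torch.Tensor:
    # True marks positions attention must NOT look at: everything
    # strictly above the diagonal, i.e. future tokens. Generic
    # illustration only, not the paper's proposed masking.
    return torch.triu(torch.ones(seq_len, seq_len), diagonal=1).bool()

print(causal_mask(4))
# tensor([[False,  True,  True,  True],
#         [False, False,  True,  True],
#         [False, False, False,  True],
#         [False, False, False, False]])
```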

Neural Zero-Inflated Quality Estimation Model For Automatic Speech Recognition System

no code implementations · 3 Oct 2019 · Kai Fan, Jiayi Wang, Bo Li, Shiliang Zhang, Boxing Chen, Niyu Ge, Zhijie Yan

The performance of automatic speech recognition (ASR) systems is usually evaluated with the word error rate (WER) metric when manually transcribed data are provided; such transcriptions, however, are expensive to obtain in real-world scenarios.

Tasks: Automatic Speech Recognition (ASR) +4
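For reference, the WER metric named in the abstract is the word-level Levenshtein (edit) distance between the reference transcript and the ASR hypothesis, normalized by the number of reference words. A minimal Python sketch, independent of the paper's model:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            dp[i][j] = min(
                dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1]),  # substitution or match
                dp[i - 1][j] + 1,                               # deletion
                dp[i][j - 1] + 1,                               # insertion
            )
    return dp[-1][-1] / max(len(ref), 1)

# One substitution (sat -> sit) and one deletion (the) over 6 reference words:
print(wer("the cat sat on the mat", "the cat sit on mat"))  # 0.333...
```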

Lattice Transformer for Speech Translation

no code implementations · ACL 2019 · Pei Zhang, Boxing Chen, Niyu Ge, Kai Fan

Recent advances in sequence modeling have highlighted the strengths of the transformer architecture, especially in achieving state-of-the-art machine translation results.

Tasks: Automatic Speech Recognition (ASR) +3
