Search Results for author: Shaohui Kuang

Found 9 papers, 1 paper with code

Spiral Language Modeling

no code implementations · 20 Dec 2021 · Yong Cao, Yukun Feng, Shaohui Kuang, Gu Xu

In almost all text generation applications, word sequences are constructed in a left-to-right (L2R) or right-to-left (R2L) manner, as natural language sentences are written either L2R or R2L.

Language Modelling · Machine Translation · +2
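The two standard generation orders the abstract contrasts can be stated in a few lines. This toy sketch uses a stand-in `step` function in place of a real language model (an illustrative assumption, not the paper's spiral scheme):

```python
def generate_l2r(step, n):
    """Grow a sequence left-to-right: each token is predicted from its prefix."""
    seq = []
    for _ in range(n):
        seq.append(step(seq))
    return seq

def generate_r2l(step, n):
    """Grow a sequence right-to-left: each token is predicted from its suffix."""
    seq = []
    for _ in range(n):
        seq.insert(0, step(seq))
    return seq

# Toy "model": always emits the current length, just to show the two orders.
print(generate_l2r(lambda ctx: len(ctx), 4))  # [0, 1, 2, 3]
print(generate_r2l(lambda ctx: len(ctx), 4))  # [3, 2, 1, 0]
```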

Translation Memory Guided Neural Machine Translation

no code implementations · 1 Jan 2021 · Shaohui Kuang, Heng Yu, Weihua Luo, Qiang Wang

Existing approaches either employ an extra encoder to encode information from the TM or concatenate the source sentence and TM sentences as the encoder's input.

Language Modelling · Machine Translation · +4
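A minimal sketch of the second approach the abstract mentions: the source sentence and retrieved TM sentences are concatenated into a single encoder input. The `<sep>` token and the retrieval step are illustrative assumptions, not the paper's exact setup:

```python
SEP = "<sep>"  # hypothetical separator token added to the vocabulary

def build_encoder_input(source_tokens, tm_sentences):
    """Concatenate the source sentence with TM sentences for the encoder."""
    tokens = list(source_tokens)
    for tm_tokens in tm_sentences:
        tokens.append(SEP)
        tokens.extend(tm_tokens)
    return tokens

# Example: one fuzzy-matched TM sentence retrieved for the source sentence.
src = ["he", "opened", "the", "door"]
tm = [["he", "closed", "the", "door"]]
print(build_encoder_input(src, tm))
# ['he', 'opened', 'the', 'door', '<sep>', 'he', 'closed', 'the', 'door']
```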

AR: Auto-Repair the Synthetic Data for Neural Machine Translation

no code implementations · 5 Apr 2020 · Shanbo Cheng, Shaohui Kuang, Rongxiang Weng, Heng Yu, Changfeng Zhu, Weihua Luo

Compared with using only limited authentic parallel data as the training corpus, many studies have shown that incorporating synthetic parallel data, generated by back-translation (BT) or forward translation (FT, or self-training), into the NMT training process can significantly improve translation quality.

Machine Translation · NMT · +2
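For context, a sketch of how back-translation produces the synthetic parallel data the abstract refers to: a target-to-source model translates monolingual target sentences, and the resulting pairs are mixed into the training corpus. Here `translate` stands in for any trained reverse model; it is an assumption, not the paper's repair pipeline:

```python
def back_translate(monolingual_targets, translate):
    """Pair each target sentence with a machine-generated source side."""
    synthetic_pairs = []
    for tgt in monolingual_targets:
        synthetic_src = translate(tgt)  # target -> source direction
        synthetic_pairs.append((synthetic_src, tgt))
    return synthetic_pairs

def build_training_corpus(authentic_pairs, synthetic_pairs):
    """Mix authentic and synthetic data; the paper studies repairing the latter."""
    return authentic_pairs + synthetic_pairs
```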

Merging External Bilingual Pairs into Neural Machine Translation

no code implementations · 2 Dec 2019 · Tao Wang, Shaohui Kuang, Deyi Xiong, António Branco

As neural machine translation (NMT) is not easily amenable to explicit correction of errors, incorporating pre-specified translations into NMT is widely regarded as a non-trivial challenge.

Machine Translation · NMT · +1
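For background, one common baseline for injecting pre-specified translations is placeholder substitution; this is a generic technique shown only for orientation, not the merging mechanism the paper proposes, and the tag name is hypothetical:

```python
PLACEHOLDER = "<term0>"  # hypothetical tag the model learns to copy through

def mask_source(source, src_phrase):
    """Replace the pre-specified source phrase with a placeholder."""
    return source.replace(src_phrase, PLACEHOLDER)

def unmask_output(output, tgt_phrase):
    """Substitute the required translation back into the model output."""
    return output.replace(PLACEHOLDER, tgt_phrase)

masked = mask_source("the patient took aspirin daily", "aspirin")
# ... translating `masked` might yield "der Patient nahm <term0> täglich" ...
print(unmask_output("der Patient nahm <term0> täglich", "Aspirin"))
# der Patient nahm Aspirin täglich
```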

Learning to Reuse Translations: Guiding Neural Machine Translation with Examples

no code implementations · 25 Nov 2019 · Qian Cao, Shaohui Kuang, Deyi Xiong

In this paper, we study the problem of enabling neural machine translation (NMT) to reuse previous translations from similar examples in target prediction.

Machine Translation · NMT · +1
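A plausible retrieval step for finding "similar examples" is fuzzy matching by string similarity; this is an assumption about the pipeline's front end, not the paper's guidance mechanism:

```python
from difflib import SequenceMatcher

def most_similar_example(source, example_pairs):
    """Return the (source, target) pair whose source best matches `source`."""
    def similarity(a, b):
        return SequenceMatcher(None, a, b).ratio()
    return max(example_pairs, key=lambda pair: similarity(source, pair[0]))

examples = [
    ("he opened the door", "il a ouvert la porte"),
    ("she read the book", "elle a lu le livre"),
]
print(most_similar_example("he opened the window", examples))
# ('he opened the door', 'il a ouvert la porte')
```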

Fusing Recency into Neural Machine Translation with an Inter-Sentence Gate Model

no code implementations · COLING 2018 · Shaohui Kuang, Deyi Xiong

Neural machine translation (NMT) systems are usually trained on a large amount of bilingual sentence pairs and translate one sentence at a time, ignoring inter-sentence information.

Machine Translation · NMT · +2
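The gate named in the title can be sketched as a standard sigmoid fusion of the current decoder state with a previous-sentence summary; the shapes and exact fusion point are assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def inter_sentence_gate(h_cur, h_prev, W, b):
    """Fuse current state h_cur with previous-sentence summary h_prev."""
    g = sigmoid(W @ np.concatenate([h_cur, h_prev]) + b)  # gate values in (0, 1)
    return g * h_cur + (1.0 - g) * h_prev                 # per-dimension mix

d = 4
rng = np.random.default_rng(0)
h_cur, h_prev = rng.normal(size=d), rng.normal(size=d)
W, b = rng.normal(size=(d, 2 * d)), np.zeros(d)
print(inter_sentence_gate(h_cur, h_prev, W, b))
```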

Incorporating Chinese Radicals Into Neural Machine Translation: Deeper Than Character Level

1 code implementation · 3 May 2018 · Lifeng Han, Shaohui Kuang

We integrate Chinese radicals into the NMT model under different settings to address the unseen-word challenge in Chinese-to-English translation.

Machine Translation · NMT · +1
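A sketch of radical-augmented input construction: each character is paired with its radical so the model can relate unseen characters that share components. The radical table here is a tiny hypothetical stand-in for a full decomposition dictionary:

```python
RADICALS = {"河": "氵", "湖": "氵", "海": "氵", "炒": "火"}  # illustrative subset

def to_radical_augmented(sentence):
    """Emit (character, radical) units; fall back to the character itself."""
    return [(ch, RADICALS.get(ch, ch)) for ch in sentence]

print(to_radical_augmented("河湖"))
# [('河', '氵'), ('湖', '氵')]
```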

Attention Focusing for Neural Machine Translation by Bridging Source and Target Embeddings

no code implementations · ACL 2018 · Shaohui Kuang, Junhui Li, António Branco, Weihua Luo, Deyi Xiong

In neural machine translation, a source sequence of words is encoded into a vector from which a target sequence is generated in the decoding phase.

Machine Translation · Sentence · +2
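A rough sketch of what "bridging" source and target embeddings could look like, attending from a target embedding directly over raw source embeddings rather than only over encoder states; the details here are inferred from the title and are assumptions, not the paper's exact model:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def bridge_attention(tgt_embed, src_embeds):
    """Attend from a target embedding directly over source word embeddings."""
    scores = src_embeds @ tgt_embed  # dot-product relevance per source word
    weights = softmax(scores)
    return weights @ src_embeds      # bridged summary of source embeddings

rng = np.random.default_rng(1)
src = rng.normal(size=(5, 8))  # 5 source words, embedding dim 8
tgt = rng.normal(size=8)
print(bridge_attention(tgt, src).shape)  # (8,)
```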
