Search Results for author: Cheng-qing Zong

Found 71 papers, 10 papers with code

Improving Autoregressive NMT with Non-Autoregressive Model

no code implementations WS 2020 Long Zhou, Jiajun Zhang, Cheng-qing Zong

In this work, we propose a novel Encoder-NAD-AD framework for NMT, aiming at boosting AT with global information produced by the NAT model.

Knowledge Distillation Machine Translation +2

Attend, Translate and Summarize: An Efficient Method for Neural Cross-Lingual Summarization

no code implementations ACL 2020 Junnan Zhu, Yu Zhou, Jiajun Zhang, Cheng-qing Zong

Cross-lingual summarization aims at summarizing a document in one language (e.g., Chinese) into another language (e.g., English).

Translation

Neural Machine Translation: Challenges, Progress and Future

1 code implementation13 Apr 2020 Jiajun Zhang, Cheng-qing Zong

Machine translation (MT) is a technique that leverages computers to translate human languages automatically.

Machine Translation NMT +1

Synchronous Speech Recognition and Speech-to-Text Translation with Interactive Decoding

1 code implementation16 Dec 2019 Yuchen Liu, Jiajun Zhang, Hao Xiong, Long Zhou, Zhongjun He, Hua Wu, Haifeng Wang, Cheng-qing Zong

Speech-to-text translation (ST), which translates source language speech into target language text, has attracted intensive attention in recent years.

Automatic Speech Recognition (ASR) +4

Synchronously Generating Two Languages with Interactive Decoding

no code implementations IJCNLP 2019 Yining Wang, Jiajun Zhang, Long Zhou, Yuchen Liu, Cheng-qing Zong

In this paper, we introduce a novel interactive approach to translate a source language into two different languages simultaneously and interactively.

Machine Translation NMT +2

NCLS: Neural Cross-Lingual Summarization

1 code implementation IJCNLP 2019 Junnan Zhu, Qian Wang, Yining Wang, Yu Zhou, Jiajun Zhang, Shaonan Wang, Cheng-qing Zong

Moreover, we propose to further improve NCLS by incorporating two related tasks, monolingual summarization and machine translation, into the training process of CLS under multi-task learning.

Machine Translation Multi-Task Learning +1

Are You for Real? Detecting Identity Fraud via Dialogue Interactions

1 code implementation IJCNLP 2019 Weikang Wang, Jiajun Zhang, Qian Li, Cheng-qing Zong, Zhifei Li

In this paper, we focus on identity fraud detection in loan applications and propose to solve this problem with a novel interactive dialogue system which consists of two modules.

Dialogue Management Fraud Detection +1

Understanding Memory Modules on Learning Simple Algorithms

no code implementations1 Jul 2019 Kexin Wang, Yu Zhou, Shaonan Wang, Jiajun Zhang, Cheng-qing Zong

Recent work has shown that memory modules are crucial for the generalization ability of neural networks on learning simple algorithms.

Dimensionality Reduction

Sequence Generation: From Both Sides to the Middle

no code implementations23 Jun 2019 Long Zhou, Jiajun Zhang, Cheng-qing Zong, Heng Yu

The encoder-decoder framework has achieved promising progress for many sequence generation tasks, such as neural machine translation and text summarization.

Machine Translation Sentence +2

Memory Consolidation for Contextual Spoken Language Understanding with Dialogue Logistic Inference

no code implementations ACL 2019 He Bai, Yu Zhou, Jiajun Zhang, Cheng-qing Zong

Dialogue contexts have proven helpful in spoken language understanding (SLU) systems and are typically encoded with explicit memory representations.

Retrieval slot-filling +2

Synchronous Bidirectional Neural Machine Translation

2 code implementations TACL 2019 Long Zhou, Jiajun Zhang, Cheng-qing Zong

In this paper, we introduce a synchronous bidirectional neural machine translation (SB-NMT) model that predicts its outputs using left-to-right and right-to-left decoding simultaneously and interactively, in order to leverage both history and future information at the same time.

Machine Translation NMT +1

End-to-End Speech Translation with Knowledge Distillation

no code implementations17 Apr 2019 Yuchen Liu, Hao Xiong, Zhongjun He, Jiajun Zhang, Hua Wu, Haifeng Wang, Cheng-qing Zong

End-to-end speech translation (ST), which directly translates from source language speech into target language text, has attracted intensive attention in recent years.

Knowledge Distillation speech-recognition +2
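As a rough illustration of the distillation idea behind this line of work: word-level knowledge distillation trains a student (here, a speech translation model) against the soft output distribution of a teacher (e.g., a text MT model). The sketch below is a toy formulation with made-up distributions, not the paper's actual implementation:

```python
import math

def distillation_loss(teacher_probs, student_probs):
    """Cross-entropy of the student against the teacher's soft targets.

    Both arguments are per-token probability distributions over the
    same (toy) target vocabulary; the names and numbers here are
    illustrative only.
    """
    return -sum(t * math.log(s) for t, s in zip(teacher_probs, student_probs))

# Toy vocabulary of 3 target tokens.
teacher = [0.7, 0.2, 0.1]   # soft targets from a trained MT teacher
student = [0.5, 0.3, 0.2]   # current ST student predictions

loss = distillation_loss(teacher, student)
```

The loss is minimized (equal to the teacher's entropy) exactly when the student's distribution matches the teacher's, which is what lets the student inherit the teacher's soft preferences among target tokens.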

Synchronous Bidirectional Inference for Neural Sequence Generation

1 code implementation24 Feb 2019 Jiajun Zhang, Long Zhou, Yang Zhao, Cheng-qing Zong

In this work, we propose a synchronous bidirectional inference model to generate outputs using both left-to-right and right-to-left decoding simultaneously and interactively.

Abstractive Text Summarization Machine Translation +1

Language-Independent Representor for Neural Machine Translation

no code implementations1 Nov 2018 Long Zhou, Yuchen Liu, Jiajun Zhang, Cheng-qing Zong, Guoping Huang

Current Neural Machine Translation (NMT) employs a language-specific encoder to represent the source sentence and adopts a language-specific decoder to generate target translation.

Machine Translation Multi-Task Learning +3

Three Strategies to Improve One-to-Many Multilingual Translation

no code implementations EMNLP 2018 Yining Wang, Jiajun Zhang, FeiFei Zhai, Jingfang Xu, Cheng-qing Zong

However, previous studies show that one-to-many translation based on this framework cannot perform on par with the individually trained models.

Machine Translation Multi-Task Learning +1

Memory, Show the Way: Memory Based Few Shot Word Representation Learning

no code implementations EMNLP 2018 Jingyuan Sun, Shaonan Wang, Cheng-qing Zong

Distributional semantic models (DSMs) generally require sufficient examples for a word to learn a high quality representation.

General Classification NER +4

Addressing Troublesome Words in Neural Machine Translation

no code implementations EMNLP 2018 Yang Zhao, Jiajun Zhang, Zhongjun He, Cheng-qing Zong, Hua Wu

One of the weaknesses of Neural Machine Translation (NMT) is in handling low-frequency and ambiguous words, which we refer to as troublesome words.

Machine Translation NMT +1

Associative Multichannel Autoencoder for Multimodal Word Representation

1 code implementation EMNLP 2018 Shaonan Wang, Jiajun Zhang, Cheng-qing Zong

In this paper we address the problem of learning multimodal word representations by integrating textual, visual and auditory inputs.

A Teacher-Student Framework for Maintainable Dialog Manager

no code implementations EMNLP 2018 Weikang Wang, Jiajun Zhang, Han Zhang, Mei-Yuh Hwang, Cheng-qing Zong, Zhifei Li

Specifically, the "student" is an extended dialog manager based on a new ontology, and the "teacher" consists of existing resources that guide the learning process of the "student".

Reinforcement Learning (RL)

Phrase Table as Recommendation Memory for Neural Machine Translation

no code implementations25 May 2018 Yang Zhao, Yining Wang, Jiajun Zhang, Cheng-qing Zong

Neural Machine Translation (NMT) has recently drawn much attention due to its promising translation performance.

Machine Translation NMT +2

Learning Multimodal Word Representation via Dynamic Fusion Methods

no code implementations2 Jan 2018 Shaonan Wang, Jiajun Zhang, Cheng-qing Zong

Multimodal models have been proven to outperform text-based models on learning semantic word representations.

Learning from Parenthetical Sentences for Term Translation in Machine Translation

no code implementations WS 2017 Guoping Huang, Jiajun Zhang, Yu Zhou, Cheng-qing Zong

Terms extensively exist in specific domains, and term translation plays a critical role in domain-specific machine translation (MT) tasks.

Machine Translation Sentence +1

Investigating Inner Properties of Multimodal Representation and Semantic Compositionality with Brain-based Componential Semantics

no code implementations15 Nov 2017 Shaonan Wang, Jiajun Zhang, Nan Lin, Cheng-qing Zong

Considering that multimodal models are originally motivated by human concept representations, we assume that correlating multimodal representations with brain-based semantics would interpret their inner properties to answer the above questions.

Learning Semantic Representations Natural Language Understanding

Word, Subword or Character? An Empirical Study of Granularity in Chinese-English NMT

1 code implementation13 Nov 2017 Yining Wang, Long Zhou, Jiajun Zhang, Cheng-qing Zong

Our experiments show that the subword model performs best for Chinese-to-English translation with a moderately sized vocabulary, while the hybrid word-character model is most suitable for English-to-Chinese translation.

Machine Translation NMT +1
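To make the granularity question concrete, here is a toy sketch of word-level versus character-level segmentation for Chinese (the example and helper names are illustrative; subword models such as BPE sit between these two extremes):

```python
def char_tokens(text):
    """Character-level segmentation: natural for Chinese, which has
    no whitespace word boundaries."""
    return [ch for ch in text if not ch.isspace()]

def word_tokens(text):
    """Word-level segmentation; real Chinese text would first need a
    word segmenter, so we illustrate with pre-segmented input."""
    return text.split()

# Pre-segmented toy example: "machine translation" in Chinese.
segmented = "机器 翻译"
print(word_tokens(segmented))   # ['机器', '翻译']
print(char_tokens(segmented))   # ['机', '器', '翻', '译']
```

The trade-off the paper studies falls out directly: word units carry more meaning but inflate the vocabulary, while character units keep the vocabulary tiny at the cost of longer, less meaningful sequences.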

Towards Neural Machine Translation with Partially Aligned Corpora

no code implementations IJCNLP 2017 Yining Wang, Yang Zhao, Jiajun Zhang, Cheng-qing Zong, Zhengshan Xue

While neural machine translation (NMT) has become the new paradigm, the parameter optimization requires large-scale parallel data which is scarce in many domains and language pairs.

Machine Translation NMT +2

Multi-modal Summarization for Asynchronous Collection of Text, Image, Audio and Video

no code implementations EMNLP 2017 Haoran Li, Junnan Zhu, Cong Ma, Jiajun Zhang, Cheng-qing Zong

In this work, we propose an extractive Multi-modal Summarization (MMS) method which can automatically generate a textual summary given a set of documents, images, audio clips and videos related to a specific topic.

Automatic Speech Recognition (ASR) Document Summarization +1

Exploiting Word Internal Structures for Generic Chinese Sentence Representation

no code implementations EMNLP 2017 Shaonan Wang, Jiajun Zhang, Cheng-qing Zong

We introduce a novel mixed character-word architecture to improve Chinese sentence representations by utilizing the rich semantic information of word internal structures.

Sentence Sentence Similarity

Look-ahead Attention for Generation in Neural Machine Translation

no code implementations30 Aug 2017 Long Zhou, Jiajun Zhang, Cheng-qing Zong

The attention model has become a standard component in neural machine translation (NMT) and it guides translation process by selectively focusing on parts of the source sentence when predicting each target word.

Machine Translation NMT +2

Neural System Combination for Machine Translation

no code implementations ACL 2017 Long Zhou, Wenpeng Hu, Jiajun Zhang, Cheng-qing Zong

Neural machine translation (NMT) has become a new approach to machine translation and generates much more fluent results than statistical machine translation (SMT).

Machine Translation NMT +1

Shortcut Sequence Tagging

no code implementations3 Jan 2017 Huijia Wu, Jiajun Zhang, Cheng-qing Zong

To simplify the stacked architecture, we propose a framework called the shortcut block, which is a marriage of the gating mechanism and shortcuts, while discarding the self-connected part of the LSTM cell.

POS POS Tagging

Bridging Neural Machine Translation and Bilingual Dictionaries

no code implementations24 Oct 2016 Jiajun Zhang, Cheng-qing Zong

Neural Machine Translation (NMT) has become the new state-of-the-art in several language pairs.

Machine Translation NMT +2

An Empirical Exploration of Skip Connections for Sequential Tagging

no code implementations COLING 2016 Huijia Wu, Jiajun Zhang, Cheng-qing Zong

In this paper, we empirically explore the effects of various kinds of skip connections in stacked bidirectional LSTMs for sequential tagging.

CCG Supertagging POS +1

A Dynamic Window Neural Network for CCG Supertagging

no code implementations10 Oct 2016 Huijia Wu, Jiajun Zhang, Cheng-qing Zong

These motivate us to build a supertagger with a dynamic window approach, which can be treated as an attention mechanism on the local contexts.

CCG Supertagging Sentence +1

Learning Sentence Representation with Guidance of Human Attention

no code implementations29 Sep 2016 Shaonan Wang, Jiajun Zhang, Cheng-qing Zong

Recently, much progress has been made in learning general-purpose sentence representations that can be used across domains.

POS Sentence

One Sentence One Model for Neural Machine Translation

no code implementations LREC 2018 Xiao-Qing Li, Jiajun Zhang, Cheng-qing Zong

Neural machine translation (NMT) has become the new state-of-the-art and achieves promising translation results using a simple encoder-decoder neural network.

Machine Translation NMT +2

Neural Name Translation Improves Neural Machine Translation

no code implementations7 Jul 2016 Xiao-Qing Li, Jiajun Zhang, Cheng-qing Zong

In order to control computational complexity, neural machine translation (NMT) systems convert all rare words outside the vocabulary into a single unk symbol.

Machine Translation NMT +2
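The vocabulary truncation described in this snippet can be sketched in a few lines; the toy corpus, vocabulary size, and `<unk>` token below are illustrative, not the paper's setup:

```python
from collections import Counter

def build_vocab(corpus, max_size):
    """Keep only the most frequent words; everything else maps to <unk>."""
    counts = Counter(w for sent in corpus for w in sent.split())
    return {w for w, _ in counts.most_common(max_size)}

def replace_rare(sentence, vocab):
    """Convert out-of-vocabulary words into a single <unk> symbol."""
    return [w if w in vocab else "<unk>" for w in sentence.split()]

corpus = ["the cat sat", "the dog sat", "the cat ran"]
vocab = build_vocab(corpus, max_size=3)    # keeps 'the', 'cat', 'sat'
print(replace_rare("the dog ran", vocab))  # ['the', '<unk>', '<unk>']
```

The sketch also shows why this hurts name translation, the problem the paper targets: a rare person or place name is collapsed into `<unk>` and its identity is lost before translation even begins.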

A Bilingual Discourse Corpus and Its Applications

no code implementations LREC 2016 Yang Liu, Jiajun Zhang, Cheng-qing Zong, Yating Yang, Xi Zhou

Existing discourse research focuses only on monolingual settings, and the inconsistency between languages limits the power of discourse theory in multilingual applications such as machine translation.

Machine Translation Translation

Beyond Word-based Language Model in Statistical Machine Translation

no code implementations5 Feb 2015 Jiajun Zhang, Shujie Liu, Mu Li, Ming Zhou, Cheng-qing Zong

The language model is one of the most important modules in statistical machine translation, and the word-based language model currently dominates the field.

Language Modelling Machine Translation +1
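For context, a word-based language model in its simplest form is an n-gram model estimated from counts; the maximum-likelihood bigram sketch below (toy corpus, no smoothing) is illustrative only, not the paper's model:

```python
from collections import Counter

def bigram_lm(corpus):
    """Maximum-likelihood bigram probabilities P(w2 | w1) from a toy corpus."""
    unigrams, bigrams = Counter(), Counter()
    for sent in corpus:
        words = ["<s>"] + sent.split() + ["</s>"]
        unigrams.update(words[:-1])                  # count each history word
        bigrams.update(zip(words[:-1], words[1:]))   # count adjacent pairs
    return lambda w1, w2: bigrams[(w1, w2)] / unigrams[w1]

prob = bigram_lm(["the cat sat", "the cat ran"])
print(prob("the", "cat"))  # 1.0  (both sentences follow 'the' with 'cat')
print(prob("cat", "sat"))  # 0.5
```

Because every probability is conditioned on surface word forms, such a model cannot generalize across related words, which is the limitation this paper moves beyond.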

Domain Adaptation for Syntactic and Semantic Dependency Parsing Using Deep Belief Networks

no code implementations TACL 2015 Haitong Yang, Tao Zhuang, Cheng-qing Zong

Experiments on English data in the CoNLL 2009 shared task show that our method largely reduced the performance drop on out-of-domain test data.

Dependency Parsing Domain Adaptation +1

Large-scale Word Alignment Using Soft Dependency Cohesion Constraints

no code implementations TACL 2013 Zhiguo Wang, Cheng-qing Zong

In this paper, we take dependency cohesion as a soft constraint, and integrate it into a generative model for large-scale word alignment experiments.

Machine Translation Translation +1

Unsupervised Tree Induction for Tree-based Translation

no code implementations TACL 2013 Feifei Zhai, Jiajun Zhang, Yu Zhou, Cheng-qing Zong

In current research, most tree-based translation models are built directly from parse trees.

Machine Translation Translation
