Search Results for author: Muhua Zhu

Found 15 papers, 4 papers with code

Exploiting Rich Syntax for Better Knowledge Base Question Answering

no code implementations 16 Jul 2021 Pengju Zhang, Yonghui Jia, Muhua Zhu, Wenliang Chen, Min Zhang

Previous work on encoding questions mainly focuses on word sequences and seldom considers information from syntactic trees. In this paper, we propose an approach that learns syntax-based representations for KBQA.

Knowledge Base Question Answering
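
The abstract above contrasts word-sequence encoders with syntax-aware ones. As a rough, self-contained illustration (not the authors' model), the sketch below extracts head-relation-child features from a hand-coded, hypothetical dependency parse of a question, the kind of tree signal a pure word-sequence encoder misses.

```python
# Minimal sketch: bag-of-dependency-path features from a question's
# syntactic tree. The parse below is hand-coded and hypothetical.

from collections import Counter

# (token, head index, relation) for "Who directed the film Titanic?";
# index -1 marks the root.
parse = [
    ("Who", 1, "nsubj"),
    ("directed", -1, "root"),
    ("the", 3, "det"),
    ("film", 1, "dobj"),
    ("Titanic", 3, "appos"),
]

def syntax_features(parse):
    """Count head-relation->child paths from a dependency tree."""
    feats = Counter()
    for child, head_idx, rel in parse:
        head = parse[head_idx][0] if head_idx >= 0 else "ROOT"
        feats[f"{head}-{rel}->{child}"] += 1
    return feats

print(syntax_features(parse))
# e.g. {'directed-nsubj->Who': 1, 'ROOT-root->directed': 1, ...}
```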

Improving AMR Parsing with Sequence-to-Sequence Pre-training

1 code implementation EMNLP 2020 Dongqin Xu, Junhui Li, Muhua Zhu, Min Zhang, Guodong Zhou

In the literature, research on abstract meaning representation (AMR) parsing has been much restricted by the size of the human-curated datasets that are critical to building an AMR parser with good performance.

Ranked #15 on AMR Parsing on LDC2017T10 (using extra training data)

AMR Parsing, Machine Translation +1
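
To make the seq2seq formalization concrete, here is a minimal fine-tuning sketch. It assumes an off-the-shelf pretrained encoder-decoder (Hugging Face BART, my stand-in; the paper constructs its own pre-training tasks) and trains on a sentence paired with a PENMAN-style linearization of its AMR graph.

```python
# Rough sketch (not the paper's exact setup): AMR parsing as seq2seq
# fine-tuning of a pretrained encoder-decoder.

import torch
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

sentence = "The boy wants to go."
# PENMAN-style linearization of the target AMR graph.
amr = "( want-01 :ARG0 ( boy ) :ARG1 ( go-02 :ARG0 boy ) )"

inputs = tokenizer(sentence, return_tensors="pt")
labels = tokenizer(amr, return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss  # standard cross-entropy
loss.backward()  # an optimizer step would follow in a real training loop
```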

Coupling Distant Annotation and Adversarial Training for Cross-Domain Chinese Word Segmentation

1 code implementation ACL 2020 Ning Ding, Dingkun Long, Guangwei Xu, Muhua Zhu, Pengjun Xie, Xiaobin Wang, Hai-Tao Zheng

To alleviate these two issues simultaneously, this paper proposes to couple distant annotation and adversarial training for cross-domain CWS.

Chinese Word Segmentation, Sentence
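
The adversarial-training half of such an approach typically relies on a gradient reversal layer so that a shared encoder learns domain-invariant features. The PyTorch sketch below shows that generic mechanism (in the style of DANN), not the paper's full architecture.

```python
# Minimal gradient reversal layer for adversarial domain adaptation.

import torch
from torch import nn

class GradReverse(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Flip the gradient so the feature extractor learns to fool the
        # domain classifier, pushing features toward domain invariance.
        return -ctx.lambd * grad_output, None

features = torch.randn(8, 128, requires_grad=True)  # shared encoder output
domain_clf = nn.Linear(128, 2)                      # source vs. target domain
domain_labels = torch.randint(0, 2, (8,))

logits = domain_clf(GradReverse.apply(features, 1.0))
loss = nn.functional.cross_entropy(logits, domain_labels)
loss.backward()  # encoder gradients are reversed; the classifier's are not
```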

Modeling Graph Structure in Transformer for Better AMR-to-Text Generation

1 code implementation IJCNLP 2019 Jie Zhu, Junhui Li, Muhua Zhu, Longhua Qian, Min Zhang, Guodong Zhou

Recent studies on AMR-to-text generation often formalize the task as a sequence-to-sequence (seq2seq) learning problem by converting an Abstract Meaning Representation (AMR) graph into a word sequence.

AMR-to-Text Generation, Text Generation
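
One simple way to picture "modeling graph structure in Transformer" is to bias self-attention with pairwise graph distances between AMR nodes, so nearby nodes attend to each other more strongly. The toy sketch below does exactly that; the paper's actual relation encoding is richer.

```python
# Toy sketch: graph-distance bias on self-attention over AMR nodes.

import torch
from collections import deque

nodes = ["want-01", "boy", "go-02"]
edges = [(0, 1), (0, 2), (2, 1)]  # want->boy, want->go, go->boy

def distances(n, edges):
    """All-pairs shortest-path lengths over the undirected graph (BFS)."""
    adj = {i: set() for i in range(n)}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    dist = torch.full((n, n), float("inf"))
    for s in range(n):
        dist[s, s] = 0.0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if dist[s, v] == float("inf"):
                    dist[s, v] = dist[s, u] + 1
                    q.append(v)
    return dist

scores = torch.randn(3, 3)            # raw self-attention logits
bias = -distances(len(nodes), edges)  # closer nodes get a smaller penalty
attn = torch.softmax(scores + bias, dim=-1)
```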

Learning When to Attend for Neural Machine Translation

no code implementations 31 May 2017 Junhui Li, Muhua Zhu

In the past few years, attention mechanisms have become an indispensable component of end-to-end neural machine translation models.

Machine Translation, Translation
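
A minimal sketch of the idea suggested by the title: standard (Luong-style) attention over encoder states, plus a learned scalar gate that decides how much attention context enters each decoding step. The gating form here is my illustration, not necessarily the paper's exact mechanism.

```python
# Sketch: attention with a learned "when to attend" gate.

import torch
from torch import nn

enc = torch.randn(10, 256)   # encoder states for 10 source tokens
dec = torch.randn(256)       # current decoder hidden state

W = nn.Linear(256, 256, bias=False)  # bilinear attention weights
gate = nn.Linear(256, 1)             # scalar gate over the decoder state

scores = enc @ W(dec)                   # one score per source token
alpha = torch.softmax(scores, dim=0)    # attention distribution
context = alpha @ enc                   # weighted sum of encoder states

g = torch.sigmoid(gate(dec))            # in [0, 1]: attend a lot or a little
output = g * context + (1 - g) * dec    # gated mix of context and state
```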

Modeling Source Syntax for Neural Machine Translation

no code implementations ACL 2017 Junhui Li, Deyi Xiong, Zhaopeng Tu, Muhua Zhu, Min Zhang, Guodong Zhou

Even though a linguistics-free sequence-to-sequence model for neural machine translation (NMT) has some capability to learn syntactic information of source sentences implicitly, this paper shows that source syntax can be incorporated into NMT explicitly and effectively to provide further improvements.

Machine Translation, NMT +1
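
One common way to expose source syntax explicitly to a seq2seq model, among the representations explored in this line of work, is to linearize the source constituency tree into a mixed sequence of bracket/label tokens and words that any standard encoder can consume. A minimal sketch, assuming a toy parse:

```python
# Sketch: linearize a constituency tree into a token sequence for NMT input.
# The tree is a hand-coded toy example: (label, children) tuples with
# word strings at the leaves.

tree = ("S", [("NP", [("PRP", ["He"])]),
              ("VP", [("VBZ", ["loves"]), ("NP", [("NNS", ["cats"])])])])

def linearize(node):
    """Depth-first traversal emitting bracket/label tokens and words."""
    label, children = node
    tokens = [f"({label}"]
    for child in children:
        tokens += [child] if isinstance(child, str) else linearize(child)
    return tokens + [")"]

print(" ".join(linearize(tree)))
# (S (NP (PRP He ) ) (VP (VBZ loves ) (NP (NNS cats ) ) ) )
```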
