Search Results for author: FeiFei Zhai

Found 14 papers, 4 papers with code

Multi-Stage Pre-training Enhanced by ChatGPT for Multi-Scenario Multi-Domain Dialogue Summarization

1 code implementation · 16 Oct 2023 · Weixiao Zhou, Gengyao Li, Xianfu Cheng, Xinnian Liang, Junnan Zhu, FeiFei Zhai, Zhoujun Li

Specifically, we first conduct domain-aware pre-training using large-scale multi-scenario multi-domain dialogue data to enhance the adaptability of our pre-trained model.

Dialogue Summary

Neural Machine Translation with Explicit Phrase Alignment

no code implementations · 26 Nov 2019 · Jiacheng Zhang, Huanbo Luan, Maosong Sun, FeiFei Zhai, Jingfang Xu, Yang Liu

The lack of alignment in NMT models leads to three problems: it is hard to (1) interpret the translation process, (2) impose lexical constraints, and (3) impose structural constraints.

Machine Translation · NMT +1

Improving the Transformer Translation Model with Document-Level Context

3 code implementations · EMNLP 2018 · Jiacheng Zhang, Huanbo Luan, Maosong Sun, FeiFei Zhai, Jingfang Xu, Min Zhang, Yang Liu

Although the Transformer translation model (Vaswani et al., 2017) has achieved state-of-the-art performance on a variety of translation tasks, how to use document-level context to handle discourse phenomena that are problematic for the Transformer remains a challenge.

Sentence Translation

Three Strategies to Improve One-to-Many Multilingual Translation

no code implementations · EMNLP 2018 · Yining Wang, Jiajun Zhang, FeiFei Zhai, Jingfang Xu, Cheng-qing Zong

However, previous studies show that one-to-many translation based on this framework does not perform on par with individually trained models.

Machine Translation · Multi-Task Learning +1

Neural Models for Sequence Chunking

1 code implementation · 15 Jan 2017 · Feifei Zhai, Saloni Potdar, Bing Xiang, Bo-Wen Zhou

Many natural language understanding (NLU) tasks, such as shallow parsing (i.e., text chunking) and semantic slot filling, require the assignment of representative labels to the meaningful chunks in a sentence.

Chunking · Natural Language Understanding +3

SummaRuNNer: A Recurrent Neural Network based Sequence Model for Extractive Summarization of Documents

7 code implementations · 14 Nov 2016 · Ramesh Nallapati, FeiFei Zhai, Bo-Wen Zhou

We present SummaRuNNer, a Recurrent Neural Network (RNN) based sequence model for extractive summarization of documents, and show that it achieves performance better than or comparable to the state of the art.

Document Summarization · Extractive Summarization +1

Unsupervised Tree Induction for Tree-based Translation

no code implementations · TACL 2013 · Feifei Zhai, Jiajun Zhang, Yu Zhou, Cheng-qing Zong

In current research, most tree-based translation models are built directly from parse trees.

Machine Translation · Translation
