SHCT: A Successively Hierarchical Conditional Transformer for Controllable Paraphrase Generation

ACL ARR November 2021 · Anonymous

Paraphrase generation has long been a challenging area of NLP. Despite the considerable achievements of previous work, existing methods lack a flexible way to incorporate multiple controllable attributes for enhancing the diversity of paraphrased sentences. To overcome this limitation, we propose a Successively Hierarchical Conditional Transformer (SHCT). SHCT combines a Conditional Variational AutoEncoder (CVAE) with the Transformer framework to exploit the CVAE's capacity for generating diverse words. More specifically, SHCT employs multi-head attention and a dynamic memory mechanism to maintain the interaction between each attribute and the corresponding encoder-layer hidden state. To absorb a flexible set of attributes, we give SHCT a hierarchical structure that successively couples the CVAE latent variables with the encoder-layer hidden states. In addition, SHCT is trained by minimizing a tailored loss that steers generation toward the required paraphrases. Finally, we conduct extensive experiments to substantiate the validity and effectiveness of the proposed model. The results show that SHCT significantly outperforms existing state-of-the-art approaches and generates more diverse paraphrased sentences.
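Since no code is released for this paper, the following is a minimal, hypothetical sketch of the core idea the abstract describes: at each encoder layer, attribute embeddings interact with the layer's hidden states via multi-head attention, the result parameterizes a per-layer CVAE latent variable, and that latent is fused back into the hidden states before the next layer. The class name `HierarchicalConditionalLayer`, the mean-pooling, the concatenation-based fusion, and all dimensions are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class HierarchicalConditionalLayer(nn.Module):
    """Hypothetical sketch of one encoder layer coupled with a CVAE latent.

    Attribute embeddings query the layer's hidden states via multi-head
    attention; the pooled context parameterizes a Gaussian latent z, which
    is broadcast over the sequence and fused back into the hidden states.
    """
    def __init__(self, d_model: int, n_heads: int, d_latent: int):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.cond_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.to_mu = nn.Linear(d_model, d_latent)
        self.to_logvar = nn.Linear(d_model, d_latent)
        self.fuse = nn.Linear(d_model + d_latent, d_model)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, h, attrs):
        # h: (batch, seq, d_model); attrs: (batch, n_attrs, d_model)
        h = self.norm1(h + self.self_attn(h, h, h)[0])
        # each attribute attends over the hidden states (attribute/state interaction)
        ctx, _ = self.cond_attn(attrs, h, h)           # (batch, n_attrs, d_model)
        pooled = ctx.mean(dim=1)                       # (batch, d_model)
        mu, logvar = self.to_mu(pooled), self.to_logvar(pooled)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        # broadcast z over the sequence and fuse it with the hidden states
        z_seq = z.unsqueeze(1).expand(-1, h.size(1), -1)
        h = self.norm2(self.fuse(torch.cat([h, z_seq], dim=-1)))
        return h, mu, logvar

# Successive hierarchy: every layer couples its own latent with its states,
# and the per-layer KL terms would feed into the paper's tailored loss.
layers = nn.ModuleList(HierarchicalConditionalLayer(512, 8, 64) for _ in range(6))
h = torch.randn(2, 20, 512)        # toy encoder input
attrs = torch.randn(2, 3, 512)     # three controllable attribute embeddings
kl_terms = []
for layer in layers:
    h, mu, logvar = layer(h, attrs)
    kl_terms.append(-0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean())
kl_loss = torch.stack(kl_terms).sum()
```

Because the latent is drawn layer by layer, the number of attribute embeddings in `attrs` can vary per example, which is one plausible reading of how the hierarchy "absorbs flexible attributes".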
