no code implementations • 5 Jan 2023 • Alexander Spangher, Xinyu Hua, Yao Ming, Nanyun Peng
While GPT-2 generates sentences that are remarkably human-like, longer documents tend to ramble and fail to follow human-like writing structure.
no code implementations • Findings (ACL) 2022 • Xinyu Hua, Lu Wang
Combined with transfer learning, a substantial F1-score boost (5-25 points) can be achieved during the early iterations of active learning across domains.
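The entry above pairs transfer learning with active learning. As a rough illustration only (the function names and the entropy-based acquisition criterion are assumptions for this sketch, not the paper's actual setup), an uncertainty-sampling selection step in an active-learning loop might look like:

```python
import math

def uncertainty(probs):
    # Entropy of the model's predicted label distribution;
    # higher entropy means the model is less certain.
    return -sum(p * math.log(p + 1e-12) for p in probs)

def select_batch(pool, predict_fn, k):
    # Pick the k unlabeled examples the current model is least
    # sure about; these are sent to annotators next.
    ranked = sorted(pool, key=lambda x: uncertainty(predict_fn(x)), reverse=True)
    return ranked[:k]
```

In a transfer-learning setting, `predict_fn` would come from a model pre-trained on the source domain, so the earliest selected batches already target the new domain's hardest cases.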
no code implementations • ACL 2021 • Xinyu Hua, Ashwin Sreevatsa, Lu Wang
To enrich the generation with diverse content, we further propose to use large pre-trained models to predict relevant concepts and to generate claims.
no code implementations • EMNLP 2020 • Xinyu Hua, Lu Wang
In this work, we present a novel content-controlled text generation framework, PAIR, with planning and iterative refinement, which is built upon a large model, BART.
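The planning-then-iterative-refinement idea can be caricatured as a mask-and-regenerate loop. The sketch below is a toy illustration under that reading: `score_fn` and `fill_fn` are hypothetical stand-ins for a BART-based token scorer and infiller, not the released PAIR system.

```python
def refine(tokens, score_fn, fill_fn, rounds=3, threshold=0.5):
    # Iteratively mask tokens the scorer flags as low quality and
    # ask the filler to regenerate them (a sketch of a generic
    # mask-and-infill refinement loop, not PAIR itself).
    tokens = list(tokens)
    for _ in range(rounds):
        low = [i for i, t in enumerate(tokens) if score_fn(t) < threshold]
        if not low:
            break  # draft is stable; stop early
        for i in low:
            tokens[i] = fill_fn(tokens, i)
    return tokens
```

In the real system the draft would be seeded from a content plan and the scorer/filler would share one pretrained sequence-to-sequence model; here they are left as injectable hooks so the control flow stands alone.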
no code implementations • AKBC 2020 • Xinyu Hua, Lei Li, Lifeng Hua, Lu Wang
We therefore propose a novel model, XREF, that leverages attention mechanisms to (1) pinpoint relevant context within comments, and (2) detect supporting entities from the news article.
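Attention-based weighting of context, of the kind the entry above describes, can be sketched with plain dot-product attention (a generic illustration, not the XREF architecture; `attend` and its inputs are assumed names):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, context):
    # Dot-product attention: score each context vector against the
    # query, normalize the scores, and return the weighted pooling.
    weights = softmax(context @ query)
    return weights @ context, weights
```

Intuitively, the weights "pinpoint" which comment tokens matter for the query entity, and the pooled vector summarizes just that relevant context.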
no code implementations • IJCNLP 2019 • Xinyu Hua, Lu Wang
Building effective text generation systems requires three critical components: content selection, text planning, and surface realization; traditionally, these are tackled as separate problems.
no code implementations • NAACL 2019 • Xinyu Hua, Mitko Nikolov, Nikhil Badugu, Lu Wang
Peer review plays a critical role in the scientific writing and publication ecosystem.
no code implementations • ACL 2018 • Xinyu Hua, Lu Wang
High-quality arguments are essential to human reasoning and decision-making.
no code implementations • WS 2017 • Xinyu Hua, Lu Wang
We study the problem of domain adaptation for neural abstractive summarization.
no code implementations • ACL 2017 • Xinyu Hua, Lu Wang
We investigate the problem of sentence-level supporting argument detection from relevant documents for user-specified claims.