1 code implementation • 16 Jan 2024 • Qiang Qu, Yiran Shen, Xiaoming Chen, Yuk Ying Chung, Tongliang Liu
In this work, we propose E2HQV, a novel events-to-video (E2V) paradigm designed to produce high-quality video frames from events.
1 code implementation • 20 Mar 2023 • Qiang Qu, Xiaoming Chen, Yuk Ying Chung, Weidong Cai
In this paper, we propose a novel concept of "anglewise attention" by introducing a multi-head self-attention mechanism to the angular domain of a light field image (LFI).
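A minimal sketch of what attention over the angular domain could look like, assuming a light field tensor flattened to (batch, angular_views, features) and using PyTorch's nn.MultiheadAttention; the module name, shapes, and sizes are illustrative, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class AnglewiseAttention(nn.Module):
    """Illustrative sketch: multi-head self-attention applied across the
    angular views of a light field image (LFI), not its spatial pixels."""

    def __init__(self, embed_dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_angular_views, embed_dim) -- one feature vector per view
        out, _ = self.attn(x, x, x)  # each angular view attends to every other view
        return out

# Toy usage: a 9x9 angular grid flattened to 81 views, 256-d features per view.
lfi_features = torch.randn(2, 81, 256)
attended = AnglewiseAttention(embed_dim=256)(lfi_features)
print(attended.shape)  # torch.Size([2, 81, 256])
```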
no code implementations • 2 Mar 2023 • Seyed Mojtaba Hosseini Bamakan, Nasim Nezhadsistani, Omid Bodaghi, Qiang Qu
The proposed framework provides fundamental elements and guidance for businesses in taking advantage of NFTs in real-world problems such as patent granting, funding, biotechnology, and so forth.
no code implementations • 5 Mar 2019 • Ziyu Liu, Meng Zhou, Weiqing Cao, Qiang Qu, Henry Wing Fung Yeung, Vera Yuk Ying Chung
The game of Chinese Checkers is a challenging traditional board game of perfect information that differs from other traditional games in two main aspects: first, unlike Chess, all checkers remain indefinitely in the game, and hence the branching factor of the search tree does not decrease as the game progresses; second, unlike Go, there is also no upper bound on the depth of the search tree, since repetitions and backward movements are allowed.
no code implementations • 20 Feb 2019 • Shuai Yu, Yongbo Wang, Min Yang, Baocheng Li, Qiang Qu, Jialie Shen
In this paper, we develop a neural attentive interpretable recommendation system, named NAIRS.
no code implementations • COLING 2018 • Min Yang, Qiang Qu, Ying Shen, Qiao Liu, Wei Zhao, Jia Zhu
Review text has been widely studied in traditional tasks such as sentiment analysis and aspect extraction.
no code implementations • 23 Apr 2018 • Yang Liu, Qiang Qu, Chao Gao
Finally, we replicate this new block into n copies and concatenate them as the input to the fully connected (FC) layer.
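A minimal sketch of this replicate-and-concatenate step in PyTorch; the block contents, the value of n, and the layer sizes are placeholders rather than the paper's settings.

```python
import torch
import torch.nn as nn

n = 4                        # number of copies (illustrative value)
block = torch.randn(8, 128)  # the "new block": batch of 8, 128-d features (placeholder)

# Replicate the block n times and concatenate along the feature dimension,
# so the fully connected (FC) layer receives n * 128 inputs per sample.
fc_input = torch.cat([block] * n, dim=1)  # shape: (8, 512)
fc = nn.Linear(n * 128, 10)               # FC layer mapping to 10 outputs
logits = fc(fc_input)
print(logits.shape)                        # torch.Size([8, 10])
```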
1 code implementation • 7 Mar 2018 • Wenyu Du, Shuai Yu, Min Yang, Qiang Qu, Jia Zhu
Finally, we concatenate the projective vectors from bipartite subnetworks with the ones learned from homogeneous subnetworks to form the final representation of the heterogeneous network.
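A minimal sketch of this final fusion step, assuming each group of subnetworks has already produced a fixed-size embedding per node; the dimensions and variable names are illustrative only.

```python
import torch

num_nodes = 1000
# Placeholder embeddings: one set learned from the bipartite subnetworks and
# one from the homogeneous subnetworks (each would come from its own encoder).
bipartite_emb = torch.randn(num_nodes, 64)
homogeneous_emb = torch.randn(num_nodes, 64)

# Concatenate per node to form the final representation of the heterogeneous network.
final_repr = torch.cat([bipartite_emb, homogeneous_emb], dim=1)  # (1000, 128)
```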
1 code implementation • 26 Nov 2017 • Linqing Liu, Yao Lu, Min Yang, Qiang Qu, Jia Zhu, Hongyan Li
In this paper, we propose an adversarial process for abstractive text summarization, in which we simultaneously train a generative model G and a discriminative model D. In particular, we build the generator G as a reinforcement learning agent that takes the raw text as input and generates the abstractive summary (a simplified sketch of this training loop follows this entry).
Ranked #5 on Text Summarization on CNN / Daily Mail (Anonymized)
Abstractive Text Summarization • Generative Adversarial Network
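A highly simplified sketch of the adversarial loop described above: the generator samples a summary, the discriminator scores generated summaries against references, and the discriminator's score is used as a reward for a policy-gradient update of the generator. The toy models, sizes, and single-step decoding are placeholders, not the paper's architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, EMB, HID, SUM_LEN = 1000, 32, 64, 8  # toy sizes

class Generator(nn.Module):
    """Toy policy: encodes the source text and samples a fixed-length summary."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.enc = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, src):
        _, h = self.enc(self.emb(src))               # encode raw text
        logits = self.out(h[-1])                     # one output distribution, reused for brevity
        dist = torch.distributions.Categorical(logits=logits)
        tokens = torch.stack([dist.sample() for _ in range(SUM_LEN)], dim=1)
        log_prob = torch.stack(
            [dist.log_prob(tokens[:, t]) for t in range(SUM_LEN)], dim=1).sum(1)
        return tokens, log_prob

class Discriminator(nn.Module):
    """Toy critic: scores how 'real' (reference-like) a summary looks."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.enc = nn.GRU(EMB, HID, batch_first=True)
        self.score = nn.Linear(HID, 1)

    def forward(self, summary):
        _, h = self.enc(self.emb(summary))
        return torch.sigmoid(self.score(h[-1])).squeeze(-1)

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)

src = torch.randint(0, VOCAB, (4, 50))       # dummy source articles
ref = torch.randint(0, VOCAB, (4, SUM_LEN))  # dummy reference summaries

for step in range(3):
    # Train D: reference summaries labeled real, generated ones labeled fake.
    fake, _ = G(src)
    d_loss = (F.binary_cross_entropy(D(ref), torch.ones(4)) +
              F.binary_cross_entropy(D(fake.detach()), torch.zeros(4)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Train G with policy gradient, using the discriminator's score as reward.
    fake, log_prob = G(src)
    reward = D(fake).detach()
    g_loss = -(reward * log_prob).mean()
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```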