1 code implementation • 29 Mar 2024 • Xiangfei Qiu, Jilin Hu, Lekui Zhou, Xingjian Wu, Junyang Du, Buang Zhang, Chenjuan Guo, Aoying Zhou, Christian S. Jensen, Zhenli Sheng, Bin Yang
Next, we employ TFB to perform a thorough evaluation of 21 Univariate Time Series Forecasting (UTSF) methods on 8,068 univariate time series and 14 Multivariate Time Series Forecasting (MTSF) methods on 25 datasets.
1 code implementation • 15 Jan 2024 • Yunshi Lan, Xinyuan Li, Hanyue Du, Xuesong Lu, Ming Gao, Weining Qian, Aoying Zhou
Natural Language Processing (NLP) aims to analyze text or speech using computational techniques.
no code implementations • 29 Aug 2023 • Jianing Wang, Chengyu Wang, Cen Chen, Ming Gao, Jun Huang, Aoying Zhou
We propose TransPrompt v2, a novel transferable prompting framework for few-shot learning across similar or distant text classification tasks.
no code implementations • 17 Feb 2023 • Jianing Wang, Chengyu Wang, Jun Huang, Ming Gao, Aoying Zhou
Neural sequence labeling (NSL) aims at assigning labels to input language tokens, covering a broad range of applications such as named entity recognition (NER) and slot filling.
1 code implementation • 5 Feb 2023 • Chengcheng Han, Yuhe Wang, Yingnan Fu, Xiang Li, Minghui Qiu, Ming Gao, Aoying Zhou
Few-shot learning has been used to tackle the problem of label scarcity in text classification, among which meta-learning-based methods have been shown to be effective, such as the prototypical networks (PROTO).
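The prototypical-network idea mentioned above is standard: embed the support examples, average them per class into a prototype, and classify each query by its nearest prototype. A minimal NumPy sketch (the embedding function is elided; points stand in for embeddings):

```python
import numpy as np

def prototypes(support_emb, support_labels, n_classes):
    """Class prototype = mean of the embedded support examples per class."""
    return np.stack([support_emb[support_labels == c].mean(axis=0)
                     for c in range(n_classes)])

def classify(query_emb, protos):
    """Assign each query to its nearest prototype (Euclidean distance)."""
    dists = np.linalg.norm(query_emb[:, None, :] - protos[None, :, :], axis=-1)
    return dists.argmin(axis=1)

# Toy 2-way 2-shot episode with 2-d "embeddings"
support = np.array([[0.0, 0.0], [0.2, 0.0],   # class 0
                    [5.0, 5.0], [5.2, 5.0]])  # class 1
labels = np.array([0, 0, 1, 1])
protos = prototypes(support, labels, n_classes=2)
query = np.array([[0.1, 0.1], [5.1, 4.9]])
print(classify(query, protos))  # → [0 1]
```

This is only the vanilla PROTO baseline the abstract refers to, not the paper's improvement on it.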
1 code implementation • 28 Dec 2022 • Jianxiang Yu, Qingqing Ge, Xiang Li, Aoying Zhou
In addition, we propose a variant model AdaMEOW that adaptively learns soft-valued weights of negative samples to further improve node representation.
1 code implementation • 27 May 2022 • Tingting Liu, Chengyu Wang, Cen Chen, Ming Gao, Aoying Zhou
With top-$k$ sparse attention, the most crucial attention relation can be obtained with a lower computational cost.
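The general top-$k$ sparse attention mechanism can be sketched as follows: compute the usual scaled dot-product scores, but keep only the $k$ largest scores per query before the softmax, zeroing out the rest. The paper's exact formulation may differ; this is an illustrative NumPy version.

```python
import numpy as np

def topk_sparse_attention(Q, K, V, k):
    """Keep only the k largest scores per query row; mask the rest before softmax."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])          # (n_queries, n_keys)
    thresh = np.sort(scores, axis=-1)[:, -k][:, None]  # k-th largest score per row
    masked = np.where(scores >= thresh, scores, -np.inf)
    weights = np.exp(masked - masked.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # rows of exactly k nonzeros
    return weights @ V, weights

rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(2, 4)), rng.normal(size=(6, 4)), rng.normal(size=(6, 4))
out, w = topk_sparse_attention(Q, K, V, k=2)
print((w > 0).sum(axis=-1))  # each query attends to at most k keys
```

In practice the saving comes from computing attention outputs only over the selected keys rather than the full dense product shown here.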
1 code implementation • 26 Apr 2022 • Yu Wang, Yu Dong, Xuesong Lu, Aoying Zhou
Current deep learning models for code summarization generally follow the principles of neural machine translation and adopt the encoder-decoder framework: the encoder learns semantic representations from source code, and the decoder transforms the learnt representations into human-readable text describing the functionality of code snippets.
no code implementations • 11 Dec 2021 • Renyu Zhu, Dongxiang Zhang, Chengcheng Han, Ming Gao, Xuesong Lu, Weining Qian, Aoying Zhou
More specifically, we construct a bipartite graph for programming problem embedding, and design an improved pre-training model PLCodeBERT for code embedding, as well as a double-sequence RNN model with exponential decay attention for effective feature fusion.
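The abstract does not spell out the exponential decay attention, so the following is only a hypothetical reading: content-based attention weights over past steps, multiplied by a factor that decays exponentially with distance from the current step. The function name and the decay rate `lam` are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def exp_decay_attention(scores, lam=0.5):
    """Hypothetical sketch: softmax over content scores, with older steps
    down-weighted by exp(-lam * distance) before normalization."""
    n = len(scores)
    dist = np.arange(n - 1, -1, -1)              # distance from the most recent step
    w = np.exp(scores - scores.max()) * np.exp(-lam * dist)
    return w / w.sum()

# With equal content scores, weight mass shifts toward the recent steps.
w = exp_decay_attention(np.zeros(5), lam=0.5)
print(np.round(w, 3))
```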
1 code implementation • Findings (ACL) 2021 • Chengcheng Han, Zeqiu Fan, Dongxiang Zhang, Minghui Qiu, Ming Gao, Aoying Zhou
Meta-learning has emerged as a trending technique to tackle few-shot text classification and achieved state-of-the-art performance.
1 code implementation • 29 Nov 2020 • Na Li, Renyu Zhu, Xiaoxu Zhou, Xiangnan He, Wenyuan Cai, Ming Gao, Aoying Zhou
In this paper, we model author disambiguation as a collaboration network reconstruction problem, and propose an incremental and unsupervised author disambiguation method, namely IUAD, which operates in a bottom-up manner.
1 code implementation • 27 Nov 2020 • Yixin Cao, Jun Kuang, Ming Gao, Aoying Zhou, Yonggang Wen, Tat-Seng Chua
In this paper, we propose a general approach to learn relation prototypes from unlabeled texts, to facilitate the long-tail relation extraction by transferring knowledge from the relation types with sufficient training data.
1 code implementation • 6 Jul 2020 • Yingnan Fu, Tingting Liu, Ming Gao, Aoying Zhou
The symbol-level image encoder of EDSL consists of a segmentation module and a reconstruction module.
1 code implementation • 8 Jul 2019 • Jun Kuang, Yixin Cao, Jianbing Zheng, Xiangnan He, Ming Gao, Aoying Zhou
In contrast to existing distant supervision approaches, which suffer from insufficient training corpora for relation extraction, our proposal mines implicit mutual relations from massive unlabeled corpora and transfers the semantic information of entity pairs into the RE model, which is more expressive and semantically plausible.
no code implementations • ACL 2019 • Chengyu Wang, Xiaofeng He, Aoying Zhou
Lexical relations describe how meanings of terms relate to each other.
no code implementations • 17 May 2019 • Shuaifeng Pang, Xiaodong Qi, Zhao Zhang, Cheqing Jin, Aoying Zhou
Although the emergence of programmable smart contracts lets blockchain systems embrace a wider range of industrial areas, executing smart contracts efficiently has become a major challenge.
Databases • Distributed, Parallel, and Cluster Computing
1 code implementation • 16 Jan 2019 • Ming Gao, Xiangnan He, Leihui Chen, Tingting Liu, Jinglin Zhang, Aoying Zhou
Recent years have witnessed a widespread increase of interest in network representation learning (NRL).
no code implementations • EMNLP 2017 • Chengyu Wang, Xiaofeng He, Aoying Zhou
A taxonomy is a semantic hierarchy, consisting of concepts linked by is-a relations.
no code implementations • EMNLP 2017 • Chengyu Wang, Yan Fan, Xiaofeng He, Aoying Zhou
User generated categories (UGCs) are short texts that reflect how people describe and organize entities, expressing rich semantic relations implicitly.
no code implementations • ACL 2017 • Chengyu Wang, Junchi Yan, Aoying Zhou, Xiaofeng He
Finding the correct hypernyms for entities is essential for taxonomy learning, fine-grained entity categorization, query understanding, etc.