1 code implementation • 26 Mar 2024 • He Zhu, Junran Wu, Ruomei Liu, Yue Hou, Ze Yuan, Shangzhe Li, YiCheng Pan, Ke Xu
Existing self-supervised methods in natural language processing (NLP), especially for hierarchical text classification (HTC), mainly focus on contrastive learning and rely heavily on human-designed augmentation rules to generate contrastive samples, which can corrupt or distort the original information.
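To make the critiqued setup concrete, here is a minimal sketch of augmentation-based contrastive learning: two "views" of a sample are produced by a hand-designed rule (random token dropout here, an illustrative choice, not the paper's method) and an InfoNCE loss pulls their embeddings together. The dropout rule is exactly the kind of rule that can distort the original text.

```python
# Minimal sketch of augmentation-based contrastive learning (assumed,
# generic setup; not the method proposed in the paper).
import numpy as np

rng = np.random.default_rng(0)

def token_dropout(tokens, p=0.1):
    """Hand-designed augmentation: randomly drop tokens (may distort meaning)."""
    keep = rng.random(len(tokens)) > p
    return [t for t, k in zip(tokens, keep) if k] or tokens[:1]

def info_nce(anchors, positives, temperature=0.1):
    """InfoNCE over L2-normalized embeddings; row i of each matrix is a pair."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))          # diagonal = matched pairs
```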
no code implementations • 1 Feb 2024 • Shangzhe Li, Xinhua Zhang
Deep generative models have recently emerged as an effective approach to offline reinforcement learning.
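The excerpt does not specify which generative model is used, so the following is only a hedged sketch of the general idea: fit p(action | state) on the fixed offline dataset by maximum likelihood and sample actions at test time, with no environment interaction. The Gaussian policy here is a placeholder assumption.

```python
# Sketch of a generative behavior model for offline RL (assumed Gaussian
# policy; the paper's actual generative model is not described here).
import torch
import torch.nn as nn

class GenerativePolicy(nn.Module):
    def __init__(self, state_dim, action_dim, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * action_dim),  # mean and log-std per action dim
        )

    def dist(self, state):
        mean, log_std = self.net(state).chunk(2, dim=-1)
        return torch.distributions.Normal(mean, log_std.clamp(-5, 2).exp())

    def loss(self, state, action):
        # Negative log-likelihood of the logged actions: a pure generative
        # fit to the offline dataset, with no environment interaction.
        return -self.dist(state).log_prob(action).sum(-1).mean()
```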
1 code implementation • 8 May 2023 • Junran Wu, Xueyuan Chen, Bowen Shi, Shangzhe Li, Ke Xu
In contrastive learning, the choice of "view" controls the information that the representation captures and influences the performance of the model.
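As a hedged illustration of how view choice filters information, consider two generic graph augmentations (these are common baselines, not the views proposed in the paper): edge dropping perturbs structure but preserves features, while feature masking does the opposite, so a representation trained on either pair captures different invariances.

```python
# Two generic graph "views" (illustrative assumptions, not the paper's):
# each preserves a different kind of information about the input graph.
import numpy as np

rng = np.random.default_rng(0)

def drop_edges(adj, p=0.2):
    """Structural view: remove each undirected edge with probability p."""
    mask = np.triu(rng.random(adj.shape) > p, k=1)
    kept = np.triu(adj, k=1) * mask
    return kept + kept.T

def mask_features(x, p=0.2):
    """Attribute view: zero out each feature column with probability p."""
    return x * (rng.random((1, x.shape[1])) > p)
```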
1 code implementation • 26 Jun 2022 • Junran Wu, Xueyuan Chen, Ke Xu, Shangzhe Li
In addition to SEP, we further design two classification models, SEP-G and SEP-N, for graph classification and node classification, respectively.
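The excerpt does not detail the SEP-G/SEP-N architectures, so the skeleton below only shows the structural difference one would expect between the two heads, as an assumption: graph classification pools node embeddings into a single graph-level vector, while node classification keeps one prediction per node. The backbone is a placeholder, not the paper's coding-tree-guided pooling.

```python
# Schematic contrast between a graph-level and a node-level head
# (hypothetical skeleton; the actual SEP-G/SEP-N layers are not reproduced).
import torch
import torch.nn as nn

class Backbone(nn.Module):
    """Placeholder message-passing encoder: adj (N,N), x (N,F) -> (N,H)."""
    def __init__(self, in_dim, hidden=64):
        super().__init__()
        self.lin = nn.Linear(in_dim, hidden)

    def forward(self, adj, x):
        return torch.relu(adj @ self.lin(x))  # one propagation step

class GraphClassifier(nn.Module):   # SEP-G-style head (assumed)
    def __init__(self, in_dim, n_classes, hidden=64):
        super().__init__()
        self.enc, self.head = Backbone(in_dim, hidden), nn.Linear(hidden, n_classes)

    def forward(self, adj, x):
        h = self.enc(adj, x).mean(dim=0)       # pool nodes -> one graph vector
        return self.head(h)

class NodeClassifier(nn.Module):    # SEP-N-style head (assumed)
    def __init__(self, in_dim, n_classes, hidden=64):
        super().__init__()
        self.enc, self.head = Backbone(in_dim, hidden), nn.Linear(hidden, n_classes)

    def forward(self, adj, x):
        return self.head(self.enc(adj, x))     # one prediction per node
```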
1 code implementation • 6 Jun 2022 • Junran Wu, Shangzhe Li, Jianhao Li, YiCheng Pan, Ke Xu
Inspired by structural entropy on graphs, we transform data samples from graphs into coding trees, a simpler yet essential structure for graph data.
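The coding-tree construction itself is omitted here, but the quantity it compresses can be anchored concretely: the one-dimensional structural entropy of a graph is H1(G) = -sum_v (d_v / 2m) log2(d_v / 2m), the baseline that a good higher-dimensional encoding (coding tree) reduces. A minimal computation, assuming a dense undirected adjacency matrix:

```python
# One-dimensional structural entropy (standard definition; the paper's
# coding-tree construction, which minimizes it, is not reproduced here).
import numpy as np

def structural_entropy_1d(adj):
    """H1(G) = -sum_v (d_v / 2m) * log2(d_v / 2m) for an undirected graph."""
    degrees = adj.sum(axis=1).astype(float)
    vol = degrees.sum()                 # equals 2m for an undirected graph
    p = degrees[degrees > 0] / vol      # stationary distribution of a random walk
    return float(-(p * np.log2(p)).sum())

# Example: a 4-cycle has uniform degree, hence entropy log2(4) = 2 bits.
cycle = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]])
print(structural_entropy_1d(cycle))  # 2.0
```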
1 code implementation • 4 Jun 2021 • Junran Wu, Ke Xu, Xueyuan Chen, Shangzhe Li, Jichang Zhao
Structural information, namely the associations among temporal points and the node weights, is then extracted from the mapped graphs to address long-range dependencies and chaotic behavior.
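The excerpt does not name the series-to-graph mapping, so the sketch below uses a natural visibility graph as an assumed stand-in, a common choice for this purpose: each time point becomes a node, and two points are linked if the straight line between them stays above every intermediate value, turning long-range dependencies into explicit edges.

```python
# Natural visibility graph of a 1-D time series (an assumed mapping, not
# necessarily the one used in the paper).
import numpy as np

def visibility_graph(y):
    """Return a dense adjacency matrix; node i corresponds to time point i."""
    n = len(y)
    adj = np.zeros((n, n), dtype=int)
    for a in range(n):
        for b in range(a + 1, n):
            # (a, b) see each other if every c between them lies strictly
            # below the chord from (a, y[a]) to (b, y[b]).
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                adj[a, b] = adj[b, a] = 1
    return adj
```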