Search Results for author: Kunlun He

Found 8 papers, 6 papers with code

CausalTime: Realistically Generated Time-series for Benchmarking of Causal Discovery

1 code implementation · 3 Oct 2023 · Yuxiao Cheng, Ziqian Wang, Tingxiong Xiao, Qin Zhong, Jinli Suo, Kunlun He

This study introduces the CausalTime pipeline, which generates time-series that closely resemble real data and come with ground-truth causal graphs for quantitative performance evaluation.

Benchmarking · Causal Discovery +1
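The core benchmarking idea is that a discovery method can only be scored quantitatively when the true causal graph is known. A minimal illustration (not the CausalTime pipeline itself, which fits generative models to real data): simulate a linear VAR(1) process from a hand-chosen adjacency matrix, so any discovery method run on the result can be checked edge-by-edge against the ground truth.

```python
import numpy as np

def simulate_var(adj, T=500, noise=0.1, seed=0):
    """Simulate a linear VAR(1) process whose lagged dependencies
    follow a known adjacency matrix `adj` (adj[i, j] != 0 means
    series j causes series i at lag 1)."""
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    x = np.zeros((T, n))
    x[0] = rng.normal(size=n)
    for t in range(1, T):
        x[t] = adj @ x[t - 1] + noise * rng.normal(size=n)
    return x

# Hypothetical ground-truth chain: series 0 -> series 1 -> series 2
adj = np.array([[0.5, 0.0, 0.0],
                [0.4, 0.5, 0.0],
                [0.0, 0.4, 0.5]])
data = simulate_var(adj)
# A causal-discovery method run on `data` can now be scored against
# the known adjacency (e.g. edge-wise AUROC or F1).
```

The chosen coefficients keep the spectral radius below 1, so the simulated process is stable.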

Medical Federated Model with Mixture of Personalized and Sharing Components

1 code implementation · 26 Jun 2023 · Yawei Zhao, Qinghe Liu, Xinwang Liu, Kunlun He

Compared with 13 existing related methods, the proposed method achieves the best model performance while improving communication efficiency by up to 60%.

Federated Learning · Tumor Segmentation
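The "mixture of personalized and sharing components" idea can be sketched generically: only the shared parameters are averaged across clients each communication round (FedAvg-style), while each client's personal parameters never leave the device. This is an illustrative sketch of the general pattern, not the paper's exact algorithm; the dict layout and names are assumptions.

```python
import numpy as np

def federated_round(clients):
    """One communication round: average the 'shared' parameters
    across clients and broadcast the result back, while leaving
    each client's 'personal' parameters untouched.

    `clients` is a list of dicts with 'shared' and 'personal'
    numpy arrays (a hypothetical parameter layout)."""
    shared_avg = np.mean([c["shared"] for c in clients], axis=0)
    for c in clients:
        c["shared"] = shared_avg.copy()  # global component, synchronized
        # c["personal"] stays local and adapts to local data only
    return clients

clients = [
    {"shared": np.array([1.0, 2.0]), "personal": np.array([0.1])},
    {"shared": np.array([3.0, 4.0]), "personal": np.array([0.9])},
]
federated_round(clients)
# both clients now hold shared = [2.0, 3.0]; personal parts still differ
```

Communicating only the shared component is also where the efficiency gain comes from: the personalized part contributes no upload/download traffic.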

CUTS+: High-dimensional Causal Discovery from Irregular Time-series

1 code implementation · 10 May 2023 · Yuxiao Cheng, Lianglong Li, Tingxiong Xiao, Zongren Li, Qin Zhong, Jinli Suo, Kunlun He

Causal discovery in time-series is a fundamental problem in the machine learning community, enabling causal reasoning and decision-making in complex scenarios.

Causal Discovery · Decision Making +3

Self-Supervised Temporal Graph Learning with Temporal and Structural Intensity Alignment

no code implementations · 15 Feb 2023 · Meng Liu, Ke Liang, Yawei Zhao, Wenxuan Tu, Sihang Zhou, Xinbiao Gan, Xinwang Liu, Kunlun He

To address this issue, we propose a self-supervised method called S2T for temporal graph learning, which extracts both temporal and structural information to learn more informative node representations.

Graph Learning

Real order total variation with applications to the loss functions in learning schemes

no code implementations · 10 Apr 2022 · Pan Liu, Xin Yang Lu, Kunlun He

Loss functions are an essential part of modern data-driven approaches, such as bi-level training schemes and machine learning.
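As background for how total variation enters a loss function: the classical first-order discrete TV of a signal is the sum of absolute differences between neighboring samples, used as a smoothness penalty. The paper generalizes this to *real* (fractional) order; the integer-order sketch below only illustrates the basic mechanism.

```python
import numpy as np

def tv_loss(u):
    """First-order (integer) discrete total variation of a 1-D signal:
    the sum of absolute differences between neighboring samples.
    Used as a regularization term in a loss, it penalizes oscillation
    while still allowing sharp jumps."""
    return np.abs(np.diff(u)).sum()

step = np.array([0.0, 0.0, 1.0, 1.0])   # one clean jump of height 1
wiggly = np.array([0.0, 0.5, 0.2, 1.0])  # same endpoints, oscillating
print(tv_loss(step))    # 1.0 — TV equals the total jump size
print(tv_loss(wiggly))  # ≈ 1.6 — oscillation is penalized more
```

A typical learning objective then combines data fidelity with this penalty, e.g. `||u - y||**2 + lam * tv_loss(u)`.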
