no code implementations • 26 Jan 2024 • Dai Shi, Andi Han, Lequan Lin, Yi Guo, Zhiyong Wang, Junbin Gao
Physics-informed Graph Neural Networks have achieved remarkable performance in learning from graph-structured data by mitigating common GNN challenges such as over-smoothing, over-squashing, and heterophily adaptation.
no code implementations • 16 Jan 2024 • Lequan Lin, Dai Shi, Andi Han, Junbin Gao
Our method generates the Fourier representation of future time series, transforming the learning process into the spectral domain enriched with spatial information.
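The core idea of moving the learning problem into the spectral domain can be illustrated with a minimal sketch: the (real) Fourier transform gives a lossless spectral representation of a time series, so a model can operate on Fourier coefficients and map back to the time domain. This is only a generic illustration of spectral-domain representation, not the paper's actual architecture; all names here are illustrative.

```python
import numpy as np

# Hedged sketch: represent a time series by its Fourier coefficients.
# A learned model would act on these coefficients; here we only show the
# lossless round trip between the time and spectral domains.
def to_spectrum(series):
    """Real FFT: time domain -> complex spectral coefficients."""
    return np.fft.rfft(series)

def from_spectrum(spectrum, n):
    """Inverse real FFT: spectral coefficients -> time series of length n."""
    return np.fft.irfft(spectrum, n=n)

t = np.linspace(0.0, 1.0, 64, endpoint=False)
x = np.sin(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 7 * t)
x_rec = from_spectrum(to_spectrum(x), n=len(x))
print(np.allclose(x, x_rec))  # True: the spectral representation is lossless
```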
2 code implementations • 28 Nov 2023 • Dai Shi
Due to the depth degradation effect of residual connections, many efficient Vision Transformer models that rely on stacked layers for information exchange often fail to achieve sufficient information mixing, leading to unnatural visual perception.
Ranked #15 on Domain Generalization on ImageNet-A
no code implementations • 13 Nov 2023 • Dai Shi, Andi Han, Lequan Lin, Yi Guo, Junbin Gao
Graph-based message-passing neural networks (MPNNs) have achieved remarkable success in both node and graph-level learning tasks.
no code implementations • 16 Oct 2023 • Andi Han, Dai Shi, Lequan Lin, Junbin Gao
Such a scheme has been found to be intrinsically linked to a physical process known as heat diffusion, where the propagation of GNNs naturally corresponds to the evolution of heat density.
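The heat-diffusion view of message passing can be sketched in a few lines: node features evolve under the graph heat equation $dx/dt = -Lx$, where $L$ is the graph Laplacian, so each propagation step spreads "heat" from a node to its neighbors. This is a generic illustration of the physical analogy, not the specific model proposed in the paper.

```python
import numpy as np

# Hedged sketch: explicit Euler steps of graph heat diffusion dx/dt = -L x.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A  # combinatorial graph Laplacian

x = np.array([1.0, 0.0, 0.0, 0.0])  # all initial "heat" on node 0
tau = 0.1                            # step size
for _ in range(200):
    x = x - tau * (L @ x)            # one Euler step: heat flows along edges

print(np.round(x, 3))                # heat approaches the uniform state
print(np.isclose(x.sum(), 1.0))      # total heat is conserved
```

The conserved total and the drift toward a uniform state mirror the over-smoothing behavior that diffusion-based analyses of GNNs study.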
1 code implementation • 12 Sep 2023 • Jiayu Zhai, Lequan Lin, Dai Shi, Junbin Gao
Much recent research on graph neural networks (GNNs) has focused on formulating GNN architectures as optimization problems under a smoothness assumption.
Ranked #31 on Node Classification on Texas
no code implementations • 6 Sep 2023 • Zhiqi Shao, Dai Shi, Andi Han, Yi Guo, Qibin Zhao, Junbin Gao
To explore more flexible filtering conditions, we further generalize MHKG into a model termed G-MHKG and thoroughly analyze the role of each element in controlling over-smoothing, over-squashing, and expressive power.
1 code implementation • 19 Jul 2023 • Dai Shi, Yi Guo, Zhiqi Shao, Junbin Gao
Motivated by the geometric analogy of Ricci curvature in the graph setting, we prove that by incorporating curvature information through carefully designed transformation functions $\zeta$, several known computational issues in GNNs, such as over-smoothing, can be alleviated in our proposed model.
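A minimal sketch of curvature-modulated message passing: here the cheap Forman-Ricci proxy $F(u,v) = 4 - \deg(u) - \deg(v)$ stands in for the graph curvature, and a sigmoid stands in for the transformation $\zeta$. The paper's actual curvature notion and $\zeta$ differ; this only illustrates the general mechanism of reweighting edges by a function of their curvature.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)  # triangle graph, all degrees 2
deg = A.sum(axis=1)

# Forman-Ricci curvature proxy per edge (meaningful only where A[u, v] == 1).
F = 4.0 - deg[:, None] - deg[None, :]

# Reweight edges by zeta(curvature), then aggregate neighbor features.
W = A * sigmoid(F)        # sigmoid plays the role of zeta (an assumption)
X = np.eye(3)             # toy node features
X_next = W @ X            # one curvature-weighted propagation step
print(np.round(W, 3))
```

On the triangle every edge has curvature 0, so every edge weight is $\zeta(0) = 0.5$; graphs with uneven degrees would attenuate or amplify edges differently.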
1 code implementation • 13 Jul 2023 • Dai Shi, Zhiqi Shao, Yi Guo, Junbin Gao
Knowledge distillation (KD) has shown great potential for transferring knowledge from a complex teacher model to a simple student model, so that a heavy learning task can be accomplished efficiently without losing much prediction accuracy.
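The standard distillation objective (Hinton et al.) can be sketched as the KL divergence between temperature-softened teacher and student distributions; the paper's specific teacher/student setup is not reproduced here, and the temperature value is an arbitrary choice for illustration.

```python
import numpy as np

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as in the standard formulation."""
    p = softmax(teacher_logits / T)
    q = softmax(student_logits / T)
    return float(np.sum(p * (np.log(p) - np.log(q))) * T * T)

t = np.array([3.0, 1.0, 0.2])
print(distillation_loss(t, t))                 # 0.0 when student matches teacher
print(distillation_loss(t, np.zeros(3)) > 0)   # positive otherwise
```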
no code implementations • 25 May 2023 • Dai Shi, Zhiqi Shao, Yi Guo, Qibin Zhao, Junbin Gao
We conduct a convergence analysis on pL-UFG, addressing a gap in the understanding of its asymptotic behavior.
1 code implementation • 27 Oct 2022 • Zhiqi Shao, Andi Han, Dai Shi, Andrey Vasnev, Junbin Gao
This paper introduces a novel Framelet Graph approach based on p-Laplacian GNN.
no code implementations • 8 Oct 2022 • Andi Han, Dai Shi, Zhiqi Shao, Junbin Gao
In this work, we provide a theoretical understanding of the framelet-based graph neural networks through the perspective of energy gradient flow.
no code implementations • 3 Jun 2021 • Dai Shi, Andi Han, Yi Guo, Junbin Gao
In this work, we investigate the validity of the learning results of some widely used dimensionality reduction (DR) and manifold learning (ManL) methods through the chart mapping function of a manifold.
no code implementations • 15 Nov 2019 • Dai Shi, Junbin Gao, Xia Hong, S. T. Boris Choy, Zhiyong Wang
These geometrical features of CMM have paved the way for developing numerical Riemannian optimization algorithms, such as Riemannian gradient descent and Riemannian trust-region methods, providing a unified optimization framework for all types of optimal transport (OT) problems.