Search Results for author: Dai Shi

Found 14 papers, 5 papers with code

Design Your Own Universe: A Physics-Informed Agnostic Method for Enhancing Graph Neural Networks

no code implementations26 Jan 2024 Dai Shi, Andi Han, Lequan Lin, Yi Guo, Zhiyong Wang, Junbin Gao

Physics-informed Graph Neural Networks have achieved remarkable performance in learning from graph-structured data by mitigating common GNN challenges such as over-smoothing, over-squashing, and heterophily adaptation.

SpecSTG: A Fast Spectral Diffusion Framework for Probabilistic Spatio-Temporal Traffic Forecasting

no code implementations16 Jan 2024 Lequan Lin, Dai Shi, Andi Han, Junbin Gao

Our method generates the Fourier representation of future time series, transforming the learning process into the spectral domain enriched with spatial information.

Time Series
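The spectral transformation described above can be illustrated with a plain FFT round trip. This is a minimal sketch, assuming only that "Fourier representation" means the discrete Fourier coefficients of the series; the function names and toy series are illustrative, not the paper's implementation:

```python
import numpy as np

def to_spectral(series):
    """Map a real-valued time series to its Fourier (spectral) representation."""
    return np.fft.rfft(series)

def from_spectral(coeffs, n):
    """Invert the spectral representation back to the time domain."""
    return np.fft.irfft(coeffs, n=n)

# Round trip: a forecasting model of this kind would predict the spectral
# coefficients of future steps rather than the raw series.
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 4 * np.pi, 64)) + 0.1 * rng.normal(size=64)
coeffs = to_spectral(x)
x_rec = from_spectral(coeffs, n=len(x))
print(np.allclose(x, x_rec))  # True
```

The round trip is lossless, so any learning done on `coeffs` carries back to the time domain without approximation error.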

TransNeXt: Robust Foveal Visual Perception for Vision Transformers

2 code implementations28 Nov 2023 Dai Shi

Due to the depth degradation effect in residual connections, many efficient Vision Transformer models that rely on stacking layers for information exchange often fail to form sufficient information mixing, leading to unnatural visual perception.

Classification, Domain Generalization +4

Exposition on over-squashing problem on GNNs: Current Methods, Benchmarks and Challenges

no code implementations13 Nov 2023 Dai Shi, Andi Han, Lequan Lin, Yi Guo, Junbin Gao

Graph-based message-passing neural networks (MPNNs) have achieved remarkable success in both node and graph-level learning tasks.

From Continuous Dynamics to Graph Neural Networks: Neural Diffusion and Beyond

no code implementations16 Oct 2023 Andi Han, Dai Shi, Lequan Lin, Junbin Gao

Such a scheme has been found to be intrinsically linked to a physical process known as heat diffusion, where the propagation of GNNs naturally corresponds to the evolution of heat density.
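The heat-diffusion view above can be sketched as an explicit Euler discretization of dx/dt = -Lx on a toy graph. The graph, step size, and iteration count are illustrative assumptions, not from the paper:

```python
import numpy as np

# Toy 4-node path graph; combinatorial Laplacian L = D - A.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

def heat_step(x, tau=0.1):
    """One explicit Euler step of the heat equation dx/dt = -L x."""
    return x - tau * (L @ x)

x = np.array([1.0, 0.0, 0.0, 0.0])  # "heat" concentrated on node 0
for _ in range(50):
    x = heat_step(x)

# Total heat is conserved (L has zero row sums) while the
# distribution smooths toward uniform, mirroring GNN propagation.
print(round(x.sum(), 6))
```

Conservation plus smoothing is exactly the behavior that links message passing to heat density evolution: repeated propagation averages features along edges.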

Bregman Graph Neural Network

1 code implementation12 Sep 2023 Jiayu Zhai, Lequan Lin, Dai Shi, Junbin Gao

Much recent research on graph neural networks (GNNs) has focused on formulating GNN architectures as an optimization problem under a smoothness assumption.

Bilevel Optimization, Node Classification
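The "GNN layers as smoothness-regularized optimization" framing can be illustrated in the plain Euclidean setting (a simplification of the paper's Bregman geometry) by gradient descent on E(x) = ||x - f||^2 + lam * x^T L x; the graph, lam, and step size are illustrative:

```python
import numpy as np

# Triangle graph; Laplacian L = D - A.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
f = np.array([1.0, -1.0, 0.0])  # input node features
lam, step = 1.0, 0.1

# Gradient descent on E(x) = ||x - f||^2 + lam * x^T L x:
# each step trades fidelity to f against smoothness over edges,
# which is the diffusion-style update read off from the objective.
x = f.copy()
for _ in range(200):
    grad = 2 * (x - f) + 2 * lam * (L @ x)
    x = x - step * grad

# The iterates converge to the closed-form minimizer (I + lam*L)^{-1} f.
x_star = np.linalg.solve(np.eye(3) + lam * L, f)
print(np.allclose(x, x_star, atol=1e-6))  # True
```

Replacing the Euclidean distance with a Bregman divergence changes the geometry of these steps without changing the overall recipe.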

Unifying over-smoothing and over-squashing in graph neural networks: A physics informed approach and beyond

no code implementations6 Sep 2023 Zhiqi Shao, Dai Shi, Andi Han, Yi Guo, Qibin Zhao, Junbin Gao

To explore more flexible filtering conditions, we further generalize MHKG into a model termed G-MHKG and thoroughly show the roles of each element in controlling over-smoothing, over-squashing and expressive power.

How Curvature Enhance the Adaptation Power of Framelet GCNs

1 code implementation19 Jul 2023 Dai Shi, Yi Guo, Zhiqi Shao, Junbin Gao

Motivated by the geometric analogy of Ricci curvature in the graph setting, we prove that by inserting the curvature information with different carefully designed transformation functions $\zeta$, several known computational issues in GNNs, such as over-smoothing, can be alleviated in our proposed model.

Graph Classification

Frameless Graph Knowledge Distillation

1 code implementation13 Jul 2023 Dai Shi, Zhiqi Shao, Yi Guo, Junbin Gao

Knowledge distillation (KD) has shown great potential for transferring knowledge from a complex teacher model to a simple student model, so that the heavy learning task can be accomplished efficiently without sacrificing much prediction accuracy.

Graph Representation Learning, Knowledge Distillation
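The soft-target loss that KD methods build on can be sketched as a temperature-scaled KL divergence between teacher and student output distributions. This is the generic KD loss, not the paper's frameless-graph variant; the logits and temperature are illustrative:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; T > 1 softens the distribution."""
    z = z / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Soft-target distillation loss: KL(teacher || student) at temperature T."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    return float(np.sum(p_t * (np.log(p_t) - np.log(p_s))))

teacher = np.array([3.0, 1.0, 0.2])
aligned = kd_loss(teacher, teacher)              # identical logits -> zero loss
off = kd_loss(np.array([0.2, 1.0, 3.0]), teacher)
print(aligned < 1e-12, off > 0)  # True True
```

Training the student against `kd_loss` (usually mixed with the ordinary label loss) is what transfers the teacher's "dark knowledge" about relative class similarities.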

Revisiting Generalized p-Laplacian Regularized Framelet GCNs: Convergence, Energy Dynamic and Training with Non-Linear Diffusion

no code implementations25 May 2023 Dai Shi, Zhiqi Shao, Yi Guo, Qibin Zhao, Junbin Gao

We conduct a convergence analysis on pL-UFG, addressing the gap in the understanding of its asymptotic behaviors.

Generalized energy and gradient flow via graph framelets

no code implementations8 Oct 2022 Andi Han, Dai Shi, Zhiqi Shao, Junbin Gao

In this work, we provide a theoretical understanding of the framelet-based graph neural networks through the perspective of energy gradient flow.

A Discussion On the Validity of Manifold Learning

no code implementations3 Jun 2021 Dai Shi, Andi Han, Yi Guo, Junbin Gao

In this work, we investigate the validity of learning results of some widely used DR and ManL methods through the chart mapping function of a manifold.

Dimensionality Reduction, speech-recognition +2

Coupling Matrix Manifolds and Their Applications in Optimal Transport

no code implementations15 Nov 2019 Dai Shi, Junbin Gao, Xia Hong, S. T. Boris Choy, Zhiyong Wang

These geometrical features of CMM have paved the way for developing numerical Riemannian optimization algorithms such as Riemannian gradient descent and Riemannian trust-region methods, forming a unified optimization framework for all types of OT problems.

Riemannian optimization
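The Riemannian gradient descent pattern referenced above (Euclidean gradient, tangent-space projection, retraction) can be sketched on the unit sphere, a far simpler manifold than the coupling matrix manifold; the matrix `M`, step size, and objective are illustrative assumptions:

```python
import numpy as np

# Maximize x^T M x over the unit sphere: the constrained maximizer is the
# leading eigenvector of M, so convergence is easy to verify.
M = np.array([[2.0, 0.5],
              [0.5, 1.0]])

x = np.array([1.0, 0.0])
for _ in range(500):
    egrad = 2 * M @ x                # Euclidean gradient of x^T M x
    rgrad = egrad - (x @ egrad) * x  # project onto the tangent space at x
    x = x + 0.1 * rgrad              # ascent step along the manifold
    x = x / np.linalg.norm(x)        # retraction back onto the sphere

lam = x @ M @ x  # converges to the largest eigenvalue (3 + sqrt(2)) / 2
print(round(lam, 4))
```

On the coupling matrix manifold the same three ingredients appear, but the metric, projection, and retraction are those of the doubly-constrained transport polytope rather than the sphere.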
