Search Results for author: Gaichao Li

Found 5 papers, 1 paper with code

Diversified Node Sampling based Hierarchical Transformer Pooling for Graph Representation Learning

no code implementations • 31 Oct 2023 • Gaichao Li, Jinsong Chen, John E. Hopcroft, Kun He

Graph pooling methods have been widely used for downsampling graphs, achieving impressive results on graph-level tasks such as graph classification and graph generation (see the sketch below).

Tasks: Graph Classification, Graph Generation (+1 more)
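
The entry above describes downsampling-based pooling only in general terms. Below is a minimal sketch of generic score-based top-k graph pooling (in the spirit of gPool/SAGPool), not the paper's diversified node sampling or hierarchical Transformer pooling; the scoring heuristic and all names are illustrative.

```python
import torch

def topk_pool(x: torch.Tensor, adj: torch.Tensor, ratio: float = 0.5):
    """Keep the top-`ratio` fraction of nodes by a per-node score.

    x:   (N, d) node features
    adj: (N, N) dense adjacency matrix
    """
    n = x.size(0)
    k = max(1, int(ratio * n))
    # Score here is a simple feature norm; a learned projection vector
    # is the usual choice in trainable pooling layers.
    score = x.norm(dim=-1)
    idx = score.topk(k).indices
    # Gate kept features by their scores so a learned scoring function
    # would receive gradients through the pooling step.
    x_pool = x[idx] * torch.sigmoid(score[idx]).unsqueeze(-1)
    adj_pool = adj[idx][:, idx]  # induced subgraph on the kept nodes
    return x_pool, adj_pool

# Tiny usage example on a random graph
x = torch.randn(6, 8)
adj = (torch.rand(6, 6) > 0.5).float()
x2, adj2 = topk_pool(x, adj, ratio=0.5)
print(x2.shape, adj2.shape)  # torch.Size([3, 8]) torch.Size([3, 3])
```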

SignGT: Signed Attention-based Graph Transformer for Graph Representation Learning

no code implementations • 17 Oct 2023 • Jinsong Chen, Gaichao Li, John E. Hopcroft, Kun He

In this way, SignGT can learn informative node representations from both long-range dependencies and local topology information (see the sketch below).

Tasks: Graph Representation Learning, Node Classification
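
A minimal sketch of what "signed" attention can look like: unlike softmax attention, whose weights are all non-negative, signed weights let a node both attract and repel information from other nodes. The tanh-based normalization here is an illustrative assumption, not necessarily SignGT's exact formulation.

```python
import torch

def signed_attention(x: torch.Tensor, wq, wk, wv) -> torch.Tensor:
    """x: (N, d) node features; wq/wk/wv: (d, d) projection matrices."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = (q @ k.T) / (k.size(-1) ** 0.5)  # (N, N) raw pairwise similarities
    attn = torch.tanh(scores)                 # signed weights in (-1, 1)
    # Normalize each row by total magnitude so outputs stay bounded.
    attn = attn / (attn.abs().sum(dim=-1, keepdim=True) + 1e-6)
    return attn @ v

d = 16
x = torch.randn(10, d)
out = signed_attention(x, *(torch.randn(d, d) for _ in range(3)))
print(out.shape)  # torch.Size([10, 16])
```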

Tokenized Graph Transformer with Neighborhood Augmentation for Node Classification in Large Graphs

no code implementations • 22 May 2023 • Jinsong Chen, Chang Liu, Kaiyuan Gao, Gaichao Li, Kun He

Graph Transformers, an emerging architecture for graph representation learning, suffer from quadratic complexity in the number of nodes when handling large graphs (illustrated in the sketch below).

Tasks: Data Augmentation, Graph Representation Learning (+1 more)
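
To make the quadratic-complexity claim concrete: full self-attention over N node tokens materializes an N x N score matrix, so memory (and time) grows with N². A quick illustration in plain PyTorch, with sizes chosen arbitrarily:

```python
import torch

def attention_scores(x: torch.Tensor) -> torch.Tensor:
    """x: (N, d) node tokens -> dense (N, N) self-attention score matrix."""
    return torch.softmax(x @ x.T / x.size(-1) ** 0.5, dim=-1)

for n in (1_000, 4_000):
    scores = attention_scores(torch.randn(n, 64))
    mib = scores.numel() * scores.element_size() / 2**20
    print(f"N={n}: score matrix uses {mib:.1f} MiB")
# N=1000 -> ~3.8 MiB; N=4000 -> ~61.0 MiB (16x the memory for 4x the nodes)
```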

Adaptive Multi-Neighborhood Attention based Transformer for Graph Representation Learning

no code implementations • 15 Nov 2022 • Gaichao Li, Jinsong Chen, Kun He

MNA-GT further employs an attention layer that learns the importance of different attention kernels, enabling the model to adaptively capture graph structural information for different nodes (see the sketch below).

Tasks: Graph Representation Learning
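
A minimal sketch of the adaptive-combination idea: each attention kernel produces its own node representations, and a small learned softmax over kernels decides, per node, how much each kernel contributes. The scoring vector and the kernel construction are illustrative assumptions, not MNA-GT's actual definitions.

```python
import torch

def combine_kernels(outs: list[torch.Tensor], w: torch.Tensor) -> torch.Tensor:
    """outs: list of K tensors, each (N, d); w: (d,) scoring vector."""
    h = torch.stack(outs, dim=1)                 # (N, K, d)
    logits = h @ w                               # (N, K) per-node kernel scores
    alpha = torch.softmax(logits, dim=1)         # per-node kernel importance
    return (alpha.unsqueeze(-1) * h).sum(dim=1)  # (N, d) weighted combination

n, d = 10, 16
kernel_outputs = [torch.randn(n, d) for _ in range(3)]  # e.g. 1-hop, 2-hop, ...
out = combine_kernels(kernel_outputs, torch.randn(d))
print(out.shape)  # torch.Size([10, 16])
```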

NAGphormer: A Tokenized Graph Transformer for Node Classification in Large Graphs

1 code implementation • 10 Jun 2022 • Jinsong Chen, Kaiyuan Gao, Gaichao Li, Kun He

In this work, we observe that existing graph Transformers treat nodes as independent tokens and construct a single long sequence composed of all node tokens to train the Transformer model, making it hard to scale to large graphs due to the quadratic complexity of self-attention in the number of nodes (see the sketch below).

Tasks: Graph Learning, Graph Mining (+1 more)
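
NAGphormer's remedy, per the paper, is to give each node its own short token sequence of aggregated multi-hop neighborhood features (the Hop2Token step), so self-attention runs over K+1 hop tokens per node rather than over all N nodes at once. A rough sketch, using simple row-normalized propagation rather than the paper's exact normalization:

```python
import torch

def hop2token(x: torch.Tensor, adj: torch.Tensor, num_hops: int) -> torch.Tensor:
    """x: (N, d) features; adj: (N, N) adjacency -> (N, K+1, d) token sequences."""
    deg = adj.sum(dim=1).clamp(min=1)
    a_norm = adj / deg.unsqueeze(1)    # row-normalized propagation matrix
    tokens, h = [x], x
    for _ in range(num_hops):
        h = a_norm @ h                 # aggregate one more hop of neighbors
        tokens.append(h)
    return torch.stack(tokens, dim=1)  # (N, K+1, d)

x = torch.randn(100, 32)
adj = (torch.rand(100, 100) > 0.9).float()
seq = hop2token(x, adj, num_hops=3)
print(seq.shape)  # torch.Size([100, 4, 32]) -- 4 tokens per node, not 100
```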
