Search Results for author: Shichang Zhang

Found 10 papers, 5 papers with code

Parameter-Efficient Tuning Large Language Models for Graph Representation Learning

no code implementations · 28 Apr 2024 · Qi Zhu, Da Zheng, Xiang Song, Shichang Zhang, Bowen Jin, Yizhou Sun, George Karypis

Inspired by this, we introduce Graph-aware Parameter-Efficient Fine-Tuning (GPEFT), a novel approach for efficient graph representation learning with LLMs on text-rich graphs.

Graph Representation Learning · Link Prediction · +1
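The excerpt above names the approach but not its mechanics. As a rough, hypothetical sketch of one way graph-aware parameter-efficient tuning can be wired up, the PyTorch snippet below prepends a GNN-derived "graph prompt" token to the inputs of a frozen text encoder and trains only a LoRA-style low-rank adapter on top. The stand-in `nn.TransformerEncoder`, module names, and dimensions are assumptions for illustration, not the paper's implementation.

```python
# Hypothetical sketch of graph-aware parameter-efficient tuning (not the GPEFT code).
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank update (LoRA-style)."""
    def __init__(self, base: nn.Linear, rank: int = 8):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False          # freeze the pretrained weight
        self.A = nn.Parameter(torch.randn(base.in_features, rank) * 0.01)
        self.B = nn.Parameter(torch.zeros(rank, base.out_features))

    def forward(self, x):
        return self.base(x) + x @ self.A @ self.B

class GraphPromptEncoder(nn.Module):
    """Tiny GNN stand-in: mean-pools neighbor features into one prompt token."""
    def __init__(self, feat_dim: int, hidden_dim: int):
        super().__init__()
        self.proj = nn.Linear(feat_dim, hidden_dim)

    def forward(self, neighbor_feats):                 # (batch, num_neighbors, feat_dim)
        return self.proj(neighbor_feats.mean(dim=1))   # (batch, hidden_dim)

class GraphAwarePEFTModel(nn.Module):
    """Prepends a graph prompt token to token embeddings of a frozen text encoder."""
    def __init__(self, vocab=1000, dim=64, feat_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        layer = nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)  # stand-in for an LLM
        for p in list(self.embed.parameters()) + list(self.encoder.parameters()):
            p.requires_grad = False
        self.graph_prompt = GraphPromptEncoder(feat_dim, dim)      # trainable
        self.head = LoRALinear(nn.Linear(dim, dim))                # trainable low-rank part

    def forward(self, token_ids, neighbor_feats):
        tok = self.embed(token_ids)                                # (B, L, dim)
        prompt = self.graph_prompt(neighbor_feats).unsqueeze(1)    # (B, 1, dim)
        h = self.encoder(torch.cat([prompt, tok], dim=1))
        return self.head(h[:, 0])                                  # node/text representation

model = GraphAwarePEFTModel()
reps = model(torch.randint(0, 1000, (4, 16)), torch.randn(4, 5, 32))
print(reps.shape)  # torch.Size([4, 64])
```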

Predicting and Interpreting Energy Barriers of Metallic Glasses with Graph Neural Networks

no code implementations · 8 Dec 2023 · Haoyu Li, Shichang Zhang, Longwen Tang, Mathieu Bauchy, Yizhou Sun

We demonstrate in our experiments that SymGNN can significantly improve the energy barrier prediction over other GNNs and non-graph machine learning models.

Representation Learning

SciBench: Evaluating College-Level Scientific Problem-Solving Abilities of Large Language Models

1 code implementation · 20 Jul 2023 · Xiaoxuan Wang, Ziniu Hu, Pan Lu, Yanqiao Zhu, Jieyu Zhang, Satyen Subramaniam, Arjun R. Loomba, Shichang Zhang, Yizhou Sun, Wei Wang

Most of the existing Large Language Model (LLM) benchmarks on scientific problem reasoning focus on problems grounded in high-school subjects and are confined to elementary algebraic operations.

Benchmarking · Language Modelling · +2

Linkless Link Prediction via Relational Distillation

no code implementations · 11 Oct 2022 · Zhichun Guo, William Shiao, Shichang Zhang, Yozen Liu, Nitesh V. Chawla, Neil Shah, Tong Zhao

In this work, to combine the advantages of GNNs and MLPs, we start by exploring direct knowledge distillation (KD) methods for link prediction, i.e., predicted logit-based matching and node representation-based matching.

Knowledge Distillation · Link Prediction · +1
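As a concrete (and purely illustrative) reading of the two matching schemes named in the excerpt, the sketch below distills a fixed "teacher" GNN into an MLP student by matching both the teacher's link logits on sampled node pairs and its node representations. The toy data, architectures, and loss weights are assumptions, not the paper's setup.

```python
# Hypothetical GNN-to-MLP distillation for link prediction (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TeacherGNN(nn.Module):
    """One round of mean-neighbor aggregation followed by a linear layer."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid_dim)

    def forward(self, x, adj):                    # adj: dense (N, N) adjacency
        deg = adj.sum(1, keepdim=True).clamp(min=1)
        return self.lin(adj @ x / deg)            # node representations (N, hid_dim)

class StudentMLP(nn.Module):
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU(),
                                 nn.Linear(hid_dim, hid_dim))

    def forward(self, x):
        return self.net(x)                        # graph-free node representations

def link_logits(h, edges):
    """Score a batch of (src, dst) pairs with a dot product."""
    return (h[edges[:, 0]] * h[edges[:, 1]]).sum(-1)

# Toy data: 100 nodes, random features/adjacency, 50 candidate edges.
x = torch.randn(100, 16)
adj = (torch.rand(100, 100) < 0.05).float()
edges = torch.randint(0, 100, (50, 2))

teacher, student = TeacherGNN(16, 32), StudentMLP(16, 32)
with torch.no_grad():                             # teacher is fixed during distillation
    h_t = teacher(x, adj)
h_s = student(x)

# 1) logit-based matching: align the student's link scores with the teacher's.
loss_logit = F.mse_loss(link_logits(h_s, edges), link_logits(h_t, edges))
# 2) representation-based matching: align node embeddings directly.
loss_repr = F.mse_loss(h_s, h_t)
loss = loss_logit + 0.5 * loss_repr               # illustrative weighting
loss.backward()
```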

GStarX: Explaining Graph Neural Networks with Structure-Aware Cooperative Games

1 code implementation · 28 Jan 2022 · Shichang Zhang, Yozen Liu, Neil Shah, Yizhou Sun

Explaining machine learning models is an important and increasingly popular area of research.

Attribute · Feature Importance · +4

Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation

1 code implementation · ICLR 2022 · Shichang Zhang, Yozen Liu, Yizhou Sun, Neil Shah

Conversely, multi-layer perceptrons (MLPs) have no graph dependency and infer much faster than GNNs, even though they are less accurate than GNNs for node classification in general.

Knowledge Distillation · Node Classification · +2
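The excerpt above states the GNN-versus-MLP trade-off that motivates distilling a GNN teacher into an MLP student. Below is a minimal, hypothetical sketch of that kind of soft-label distillation for node classification (cross-entropy on labeled nodes plus a temperature-scaled KL term toward the teacher); the architectures, temperature, and mixing weight are placeholders rather than the paper's settings.

```python
# Hypothetical GNN-to-MLP soft-label distillation for node classification.
import torch
import torch.nn as nn
import torch.nn.functional as F

N, F_IN, C, T = 200, 32, 7, 2.0       # nodes, features, classes, temperature

x = torch.randn(N, F_IN)
adj = (torch.rand(N, N) < 0.05).float()
labels = torch.randint(0, C, (N,))
train_mask = torch.rand(N) < 0.3      # labeled nodes

class TeacherGNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.lin = nn.Linear(F_IN, C)
    def forward(self, x, adj):
        deg = adj.sum(1, keepdim=True).clamp(min=1)
        return self.lin(adj @ x / deg)            # class logits per node

student = nn.Sequential(nn.Linear(F_IN, 64), nn.ReLU(), nn.Linear(64, C))
teacher = TeacherGNN()                            # assume pretrained in practice

with torch.no_grad():
    soft_targets = F.softmax(teacher(x, adj) / T, dim=-1)

logits = student(x)                               # no graph needed at inference time
loss_ce = F.cross_entropy(logits[train_mask], labels[train_mask])
loss_kd = F.kl_div(F.log_softmax(logits / T, dim=-1), soft_targets,
                   reduction="batchmean") * T * T
loss = 0.5 * loss_ce + 0.5 * loss_kd              # illustrative mixing weight
loss.backward()
```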

Graph Condensation for Graph Neural Networks

2 code implementations · ICLR 2022 · Wei Jin, Lingxiao Zhao, Shichang Zhang, Yozen Liu, Jiliang Tang, Neil Shah

Given the prevalence of large-scale graphs in real-world applications, the storage and time required to train neural models have raised increasing concerns.

Motif-Driven Contrastive Learning of Graph Representations

no code implementations · 23 Dec 2020 · Shichang Zhang, Ziniu Hu, Arjun Subramonian, Yizhou Sun

Our framework MotIf-driven Contrastive leaRning Of Graph representations (MICRO-Graph) can: 1) use GNNs to extract motifs from large graph datasets; 2) leverage the learned motifs to sample informative subgraphs for contrastive learning of GNNs.

Clustering · Contrastive Learning · +1
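As a rough illustration of step 2 in the excerpt, the sketch below applies an InfoNCE-style contrastive objective to subgraph embeddings: two views of each subgraph are encoded by a shared network and pulled together, with the other subgraphs in the batch acting as negatives. The encoder, the perturbed second view, and all dimensions are stand-ins; motif extraction and motif-guided sampling are not shown.

```python
# Hypothetical InfoNCE objective over subgraph embeddings (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SubgraphEncoder(nn.Module):
    """Stand-in for a GNN: mean-pools node features of a subgraph, then projects."""
    def __init__(self, feat_dim, emb_dim):
        super().__init__()
        self.proj = nn.Sequential(nn.Linear(feat_dim, emb_dim), nn.ReLU(),
                                  nn.Linear(emb_dim, emb_dim))
    def forward(self, node_feats):                # (batch, nodes, feat_dim)
        return self.proj(node_feats.mean(dim=1))  # (batch, emb_dim)

def info_nce(z1, z2, tau=0.2):
    """Contrast matching subgraph views against all others in the batch."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    sim = z1 @ z2.t() / tau                       # (batch, batch) cosine similarities
    targets = torch.arange(z1.size(0))            # positives sit on the diagonal
    return F.cross_entropy(sim, targets)

# Toy batch: 8 subgraphs, two views each (e.g., motif-guided samples in the paper).
view1 = torch.randn(8, 12, 16)                    # 12 nodes, 16-dim features
view2 = view1 + 0.1 * torch.randn_like(view1)     # perturbed second view

enc = SubgraphEncoder(16, 64)
loss = info_nce(enc(view1), enc(view2))
loss.backward()
```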
