Search Results for author: Yanglan Gan

Found 1 paper, 0 papers with code

Large Language Model Meets Graph Neural Network in Knowledge Distillation

no code implementations · 8 Feb 2024 · Shengxiang Hu, Guobing Zou, Song Yang, Yanglan Gan, Bofeng Zhang, Yixin Chen

Despite recent community revelations about the advancements and potential applications of Large Language Models (LLMs) in understanding Text-Attributed Graphs (TAGs), deploying LLMs in production is hindered by their high computational and storage requirements, as well as long inference latencies.

Contrastive Learning · Knowledge Distillation · +4
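
The listing tags knowledge distillation as the core technique: a compact student model (here a GNN) is trained to mimic a heavy teacher (here an LLM) so that inference no longer needs the teacher. The sketch below is a generic, minimal response-based distillation loss in PyTorch, shown only to illustrate the named technique; it is not the paper's actual objective, and the temperature T, weight alpha, and the idea of matching class logits are all assumptions for the example.

    # Minimal sketch: distill a frozen teacher's soft predictions into a
    # lightweight student (e.g. a GNN) alongside the usual supervised loss.
    # Hyperparameters T and alpha are illustrative, not from the paper.
    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
        # Soft targets from the frozen teacher, smoothed by temperature T.
        soft_teacher = F.softmax(teacher_logits / T, dim=-1)
        soft_student = F.log_softmax(student_logits / T, dim=-1)
        kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)
        # Hard-label cross-entropy on the student's own predictions.
        ce = F.cross_entropy(student_logits, labels)
        return alpha * kd + (1.0 - alpha) * ce

In practice the teacher's logits would be precomputed once over the text-attributed nodes, so training and inference only ever run the small student model.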
