1 code implementation • 4 May 2023 • Fangkai Jiao, Bosheng Ding, Tianze Luo, Zhanfeng Mo
This project focuses on enhancing open-source large language models through instruction-tuning and providing comprehensive evaluations of their performance.
1 code implementation • 16 Nov 2022 • Tianze Luo, Zhanfeng Mo, Sinno Jialin Pan
In this paper, we argue that running full-rank diffusion SDEs over the entire graph adjacency-matrix space hinders diffusion models from learning graph topology, and hence significantly degrades the quality of the generated graphs.
no code implementations • 25 Sep 2019 • Hao Chen, Zhanfeng Mo, Qingyi Gao, Zhouwang Yang, Xiao Wang
To better understand GANs as unsupervised models, we establish a generalization bound that holds uniformly over the choice of generator.