Search Results for author: Tongtian Zhu

Found 5 papers, 5 papers with code

Adversarial Erasing with Pruned Elements: Towards Better Graph Lottery Ticket

1 code implementation • 5 Aug 2023 • Yuwen Wang, Shunyu Liu, KaiXuan Chen, Tongtian Zhu, Ji Qiao, Mengjie Shi, Yuanyu Wan, Mingli Song

Graph Lottery Ticket (GLT), a combination of core subgraph and sparse subnetwork, has been proposed to mitigate the computational cost of deep Graph Neural Networks (GNNs) on large input graphs while preserving original performance.

Decentralized SGD and Average-direction SAM are Asymptotically Equivalent

1 code implementation • 5 Jun 2023 • Tongtian Zhu, Fengxiang He, KaiXuan Chen, Mingli Song, DaCheng Tao

Decentralized stochastic gradient descent (D-SGD) allows collaborative learning across a massive number of devices simultaneously, without the control of a central server.
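The D-SGD update described above can be sketched in a few lines: each worker gossip-averages parameters with its neighbors via a mixing matrix, then takes a local gradient step. This is a minimal toy illustration (ring topology, quadratic loss), not the paper's implementation; all names here are illustrative.

```python
import numpy as np

def dsgd_step(params, grads, W, lr=0.1):
    """One D-SGD round: average neighbors' parameters via the
    mixing matrix W, then take a local gradient step."""
    mixed = W @ params          # gossip averaging with neighbors
    return mixed - lr * grads   # local SGD update

n, d = 4, 3
rng = np.random.default_rng(0)
params = rng.normal(size=(n, d))   # one parameter row per worker
target = np.ones(d)                # shared optimum of the toy loss

# Doubly stochastic mixing matrix for a ring: self plus two neighbors.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1 / 3

for _ in range(200):
    grads = params - target        # gradient of 0.5 * ||x - target||^2
    params = dsgd_step(params, grads, W)

# All workers reach consensus at the optimum without a central server.
print(np.allclose(params, target, atol=1e-3))
```

Note that no worker ever communicates beyond its two ring neighbors; consensus emerges purely from repeated local averaging.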

Improving Expressivity of GNNs with Subgraph-specific Factor Embedded Normalization

1 code implementation • 31 May 2023 • KaiXuan Chen, Shunyu Liu, Tongtian Zhu, Tongya Zheng, Haofei Zhang, Zunlei Feng, Jingwen Ye, Mingli Song

Graph Neural Networks (GNNs) have emerged as a powerful category of learning architecture for handling graph-structured data.

Contrastive Identity-Aware Learning for Multi-Agent Value Decomposition

1 code implementation • 23 Nov 2022 • Shunyu Liu, Yihe Zhou, Jie Song, Tongya Zheng, KaiXuan Chen, Tongtian Zhu, Zunlei Feng, Mingli Song

Value Decomposition (VD) aims to deduce the contributions of agents for decentralized policies in the presence of only global rewards, and has recently emerged as a powerful credit assignment paradigm for tackling cooperative Multi-Agent Reinforcement Learning (MARL) problems.

Tags: Contrastive Learning, SMAC+

Topology-aware Generalization of Decentralized SGD

1 code implementation • 25 Jun 2022 • Tongtian Zhu, Fengxiang He, Lan Zhang, Zhengyang Niu, Mingli Song, DaCheng Tao

Our theory indicates that the generalizability of D-SGD is positively correlated with the spectral gap, and can explain why consensus control in the initial training phase can ensure better generalization.
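The spectral gap mentioned in the abstract is a standard quantity of the communication topology's mixing matrix: one minus the second-largest eigenvalue magnitude, with larger gaps meaning faster consensus. The sketch below computes it for two example topologies; these are illustrative and not the paper's experimental setup.

```python
import numpy as np

def spectral_gap(W):
    """Spectral gap 1 - |lambda_2| of a mixing matrix W."""
    eigs = np.sort(np.abs(np.linalg.eigvals(W)))[::-1]
    return 1.0 - eigs[1]

n = 8
# Ring: each node averages with itself and its two neighbors.
ring = np.zeros((n, n))
for i in range(n):
    ring[i, [i, (i - 1) % n, (i + 1) % n]] = 1 / 3

# Fully connected graph: uniform averaging over all nodes.
complete = np.full((n, n), 1 / n)

print(spectral_gap(ring))      # small gap -> slow consensus
print(spectral_gap(complete))  # gap of 1 -> one-step consensus
```

A sparse ring has a small spectral gap while the complete graph attains the maximum gap of 1, which matches the intuition that denser topologies mix information faster.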
