no code implementations • 7 Feb 2024 • Jiahua Rao, Jiancong Xie, Hanjing Lin, Shuangjia Zheng, Zhen Wang, Yuedong Yang
While such methods can improve GNN predictions, they usually perform poorly on explanations.
no code implementations • 27 Apr 2023 • Jiahua Rao, Zifei Shan, Longpo Liu, Yao Zhou, Yuedong Yang
With the recent progress in large-scale vision and language representation learning, Vision Language Pre-training (VLP) models have achieved promising improvements on various multi-modal downstream tasks.
1 code implementation • 12 May 2022 • Jiahua Rao, Shuangjia Zheng, Sijie Mai, Yuedong Yang
To address these problems, we propose a novel Communicative Subgraph representation learning for Multi-relational Inductive drug-Gene interactions prediction (CoSMIG), where the predictions of drug-gene relations are made through subgraph patterns, and thus are naturally inductive for unseen drugs/genes without retraining or utilizing external domain features.
1 code implementation • 19 Jul 2021 • Jianwen Chen, Shuangjia Zheng, Ying Song, Jiahua Rao, Yuedong Yang
To this end, we propose a Communicative Message Passing Transformer (CoMPT) neural network that improves molecular graph representations by reinforcing message interactions between nodes and edges on top of the Transformer architecture.
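The node–edge message interaction described above can be illustrated with a minimal sketch. This is a hypothetical simplification, not the authors' CoMPT implementation: node states are updated with messages that are modulated by the features of their incident edges.

```python
import numpy as np

def message_passing_step(node_feats, edge_feats, edges):
    """One round of node updates using edge-conditioned messages.

    node_feats: (N, d) array of node features
    edge_feats: dict mapping (src, dst) -> (d,) edge feature
    edges: list of directed (src, dst) pairs
    """
    messages = np.zeros_like(node_feats)
    counts = np.zeros(len(node_feats))
    for src, dst in edges:
        # the message combines the sender's state with the edge feature,
        # so edge information participates directly in node updates
        messages[dst] += node_feats[src] * edge_feats[(src, dst)]
        counts[dst] += 1
    counts = np.maximum(counts, 1)  # avoid division by zero for isolated nodes
    # mix averaged incoming messages with the previous node state
    return 0.5 * node_feats + 0.5 * (messages / counts[:, None])
```

A real model would replace the fixed mixing weights with learned projections and attention, but the sketch shows the key idea of letting edge features gate the messages.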
2 code implementations • 1 Jul 2021 • Jiahua Rao, Shuangjia Zheng, Yuedong Yang
Advances in machine learning have led to graph neural network-based methods for drug discovery, yielding promising results in molecular design, chemical synthesis planning, and molecular property prediction.
1 code implementation • 2 Jul 2019 • Shuangjia Zheng, Jiahua Rao, Zhongyue Zhang, Jun Xu, Yuedong Yang
Synthesis planning is the process of recursively decomposing target molecules into available precursors.
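The recursive decomposition can be sketched as a toy search. This is a hypothetical illustration, not the authors' method: `templates` mapping a molecule to candidate precursor sets and the `available` set are assumed inputs.

```python
def plan(target, available, templates, depth=0, max_depth=5):
    """Recursively decompose `target` until every leaf is an available precursor.

    Returns a nested route: a bare molecule if purchasable, or
    (molecule, [sub-routes]) for a decomposition step; None if no route found.
    """
    if target in available:
        return target  # base case: precursor is directly available
    if depth >= max_depth:
        return None    # give up beyond the depth limit
    for precursors in templates.get(target, []):
        sub = [plan(p, available, templates, depth + 1, max_depth)
               for p in precursors]
        if all(s is not None for s in sub):
            return (target, sub)  # all branches succeeded
    return None
```

Practical planners replace this exhaustive loop with learned policies that rank which template to try first, which is where neural guidance enters.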