no code implementations • 16 Apr 2024 • Lijun Liu, Jiali Yang, Jianfei Song, Xinglin Yang, Lele Niu, Zeqi Cai, Hui Shi, Tingjun Hou, Chang-Yu Hsieh, Weiran Shen, Yafeng Deng
Additionally, with no AAV9 capsid data available beyond a single wild-type sequence, we used the same model to directly generate a number of viable sequences with up to 9 mutations.
no code implementations • 15 Mar 2024 • Odin Zhang, Yufei Huang, Shichen Cheng, Mengyao Yu, Xujun Zhang, Haitao Lin, Yundian Zeng, Mingyang Wang, Zhenxing Wu, Huifeng Zhao, Zaixi Zhang, Chenqing Hua, Yu Kang, Sunliang Cui, Peichen Pan, Chang-Yu Hsieh, Tingjun Hou
Most earlier 3D structure-based molecular generation approaches follow an atom-wise paradigm, incrementally adding atoms to a partially built molecular fragment within protein pockets.
no code implementations • 16 Feb 2024 • Yiheng Zhu, Zitai Kong, Jialu Wu, Weize Liu, Yuqiang Han, Mingze Yin, Hongxia Xu, Chang-Yu Hsieh, Tingjun Hou
To set the stage, we first outline the foundational tasks in protein sequence design in terms of the constraints involved and present key generative models and optimization algorithms.
no code implementations • 5 Nov 2023 • Yue Wan, Jialu Wu, Tingjun Hou, Chang-Yu Hsieh, Xiaowei Jia
Self-supervised learning (SSL) has emerged as a popular solution, utilizing large-scale, unannotated molecular data to learn a foundational representation of chemical space that might be advantageous for downstream tasks.
no code implementations • 4 Aug 2023 • Haotian Zhang, Huifeng Zhao, Xujun Zhang, Qun Su, Hongyan Du, Chao Shen, Zhe Wang, Dan Li, Peichen Pan, Guangyong Chen, Yu Kang, Chang-Yu Hsieh, Tingjun Hou
Drug discovery is a highly complicated process, and it is infeasible to entrust it entirely to recently developed molecular generation methods.
1 code implementation • 22 Jun 2023 • Tianyue Wang, Xujun Zhang, Odin Zhang, Peichen Pan, Guangyong Chen, Yu Kang, Chang-Yu Hsieh, Tingjun Hou
Protein loop modeling remains one of the most challenging tasks in protein structure prediction.
1 code implementation • 15 May 2023 • Yiheng Zhu, Zhenqiu Ouyang, Ben Liao, Jialu Wu, Yixuan Wu, Chang-Yu Hsieh, Tingjun Hou, Jian Wu
However, limited attention has been paid to hierarchical generative models, which can exploit the inherent hierarchical structure (with rich semantic information) of molecular graphs and generate large, complex molecules that, as we shall demonstrate, are difficult for most existing models to produce.
no code implementations • 12 Apr 2023 • Zaixi Zhang, Qi Liu, Chee-Kong Lee, Chang-Yu Hsieh, Enhong Chen
Our extensive investigation reveals that the 2D topology and 3D geometry contain intrinsically complementary information in molecule design, and provide new insights into machine learning-based molecule representation and generation.
1 code implementation • NeurIPS 2023 • Yiheng Zhu, Jialu Wu, Chaowen Hu, Jiahuan Yan, Chang-Yu Hsieh, Tingjun Hou, Jian Wu
Many crucial scientific problems involve designing novel molecules with desired properties, which can be formulated as a black-box optimization problem over the discrete chemical space.
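The black-box formulation above is easy to make concrete. Below is a minimal sketch, assuming a hypothetical scoring oracle and a toy SMILES candidate pool (neither is from the paper): the optimizer may only query the oracle on a limited budget, never inspect its internals.

```python
import random

# Toy black-box objective over a discrete space of candidate molecules.
# Candidates are SMILES strings and score() is a stand-in for an expensive
# oracle (e.g. a docking run or wet-lab assay); both are hypothetical
# placeholders, not the paper's actual setup.
CANDIDATES = ["CCO", "CCN", "c1ccccc1", "CC(=O)O", "CCOC", "CCC"]

def score(smiles: str) -> float:
    # Stand-in oracle: reward longer strings, with a bonus for a ring.
    return len(smiles) + (2.0 if "1" in smiles else 0.0)

def black_box_search(candidates, budget=4, seed=0):
    """Query the oracle on a limited budget and keep the best molecule."""
    rng = random.Random(seed)
    pool = rng.sample(candidates, k=min(budget, len(candidates)))
    return max(pool, key=score)

print(black_box_search(CANDIDATES))
```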
1 code implementation • 3 Jan 2023 • Jonathan P. Mailoa, Zhaofeng Ye, Jiezhong Qiu, Chang-Yu Hsieh, Shengyu Zhang
Generating binding poses of small-molecule candidates (ligands) within their target protein pockets is an important task in computer-aided drug discovery.
1 code implementation • Nature Machine Intelligence 2022 • Yuquan Li, Chang-Yu Hsieh, Ruiqiang Lu, Xiaoqing Gong, Xiaorui Wang, Pengyong Li, Shuo Liu, Yanan Tian, Dejun Jiang, Jiaxian Yan, Qifeng Bai, Huanxiang Liu, Shengyu Zhang, Xiaojun Yao
In fact, the pursuit of high prediction performance on a limited number of datasets has crystallized their architectures and hyperparameters, making them less adaptable to new data generated in drug discovery.
Ranked #1 on Drug Discovery on ToxCast (Toxicity Forecaster)
1 code implementation • 29 Jan 2022 • Yue Wan, Benben Liao, Chang-Yu Hsieh, Shengyu Zhang
In this paper, we propose Retroformer, a novel Transformer-based architecture for retrosynthesis prediction without relying on any cheminformatics tools for molecule editing.
no code implementations • 15 Sep 2021 • Junsheng Kong, Weizhao Li, Zeyi Liu, Ben Liao, Jiezhong Qiu, Chang-Yu Hsieh, Yi Cai, Shengyu Zhang
In this work, we show that with merely a small fraction of contexts (Q-contexts) that are typical of the whole corpus (and their mutual information with words), one can construct high-quality word embeddings with negligible errors.
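For context, here is a minimal sketch of the classical mutual-information view of embeddings that this line of work builds on: factorize a positive PMI word-context matrix with a truncated SVD. The toy corpus is illustrative, and the paper's Q-context selection strategy itself is not reproduced.

```python
import numpy as np

# Minimal mutual-information-based word embeddings: build a word-context
# co-occurrence matrix, convert to positive PMI, and factorize with SVD.
corpus = [["the", "cat", "sat"], ["the", "dog", "sat"], ["a", "cat", "ran"]]
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Co-occurrence counts within each sentence (window = whole sentence).
C = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for w in sent:
        for c in sent:
            if w != c:
                C[idx[w], idx[c]] += 1

# Positive pointwise mutual information: max(0, log P(w,c) / (P(w) P(c))).
total = C.sum()
Pw = C.sum(axis=1, keepdims=True) / total
Pc = C.sum(axis=0, keepdims=True) / total
with np.errstate(divide="ignore"):
    pmi = np.log((C / total) / (Pw * Pc))
ppmi = np.maximum(pmi, 0.0)
ppmi[~np.isfinite(ppmi)] = 0.0

# Rank-2 embeddings from the truncated SVD of the PPMI matrix.
U, S, _ = np.linalg.svd(ppmi)
embeddings = U[:, :2] * np.sqrt(S[:2])
print(dict(zip(vocab, embeddings.round(2))))
```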
1 code implementation • Chemical Engineering Journal 2021 • Xiaorui Wang, Yuquan Li, Jiezhong Qiu, Guangyong Chen, Huanxiang Liu, Benben Liao, Chang-Yu Hsieh, Xiaojun Yao
RetroPrime achieves Top-1 accuracies of 64.8% and 51.4% on the USPTO-50K dataset when the reaction type is known and unknown, respectively (the Top-k metric is sketched below).
Ranked #13 on Single-step retrosynthesis on USPTO-50k
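For readers unfamiliar with the metric, here is a minimal sketch of Top-k exact-match accuracy as commonly computed for single-step retrosynthesis. SMILES canonicalization (e.g. with RDKit) is omitted for brevity, and the candidate strings are made up.

```python
# A prediction counts as correct if the ground-truth reactant set appears
# among the model's top k ranked outputs.
def top_k_accuracy(predictions, targets, k=1):
    """predictions: list of ranked candidate lists; targets: gold strings."""
    hits = sum(1 for ranked, gold in zip(predictions, targets)
               if gold in ranked[:k])
    return hits / len(targets)

preds = [["CCO.CBr", "CCBr.CO"], ["CC(=O)Cl.CN", "CCN"]]
gold = ["CCO.CBr", "CCN"]
print(top_k_accuracy(preds, gold, k=1))  # 0.5
print(top_k_accuracy(preds, gold, k=2))  # 1.0
```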
1 code implementation • 17 Aug 2021 • Yijia Xiao, Jiezhong Qiu, Ziang Li, Chang-Yu Hsieh, Jie Tang
The emergence of deep learning models has made it possible to model patterns in large quantities of data.
no code implementations • 11 Mar 2021 • Shi-Xin Zhang, Chang-Yu Hsieh, Shengyu Zhang, Hong Yao
For instance, a key component of VQAs is the design of task-dependent parameterized quantum circuits (PQCs), analogous to designing a good neural architecture in deep learning.
Neural Architecture Search • Quantum Physics
1 code implementation • 4 Nov 2020 • Pengyong Li, Yuquan Li, Chang-Yu Hsieh, Shengyu Zhang, Xianggen Liu, Huanxiang Liu, Sen Song, Xiaojun Yao
These advantages have established TrimNet as a powerful and useful computational tool in solving the challenging problem of molecular representation learning.
Ranked #1 on Drug Discovery on MUV
1 code implementation • 16 Oct 2020 • Shi-Xin Zhang, Chang-Yu Hsieh, Shengyu Zhang, Hong Yao
Here, we propose a general framework of differentiable quantum architecture search (DQAS), which enables automated design of quantum circuits in an end-to-end differentiable fashion (a toy relaxation of the idea is sketched below).
Quantum Physics
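The sketch below is a toy, DARTS-style softmax relaxation over a discrete gate choice, meant only to illustrate the "differentiable architecture search" idea; it is not DQAS itself, and the weighted state mixture is a mathematical relaxation rather than a physical circuit.

```python
import numpy as np

# Candidate single-qubit gates; architecture parameters alpha define a
# softmax over them, optimized until the heaviest-weighted gate maps
# |0> to the target |1>.
I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
GATES = [I, H, X]
psi0 = np.array([1.0, 0.0])    # |0>
target = np.array([0.0, 1.0])  # |1>

def loss(alpha):
    w = np.exp(alpha) / np.exp(alpha).sum()           # softmax over gates
    psi = sum(wi * (G @ psi0) for wi, G in zip(w, GATES))
    psi = psi / np.linalg.norm(psi)                   # renormalize mixture
    return 1.0 - abs(target @ psi) ** 2               # infidelity

alpha = np.zeros(3)
for _ in range(200):                                  # finite-difference descent
    grad = np.array([(loss(alpha + 1e-5 * e) - loss(alpha)) / 1e-5
                     for e in np.eye(3)])
    alpha -= 0.5 * grad
print("chosen gate:", ["I", "H", "X"][int(np.argmax(alpha))])  # -> X
```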
no code implementations • 18 Jun 2020 • Jonathan Allcock, Chang-Yu Hsieh
We propose a quantum algorithm for training nonlinear support vector machines (SVM) for feature space learning where classical input data is encoded in the amplitudes of quantum states.
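A minimal sketch of the amplitude-encoding idea behind such quantum kernel methods, simulated classically: a vector is normalized into the amplitudes of a quantum state, and the kernel between two inputs is the squared overlap that a swap test would estimate on hardware. This is not the paper's full training algorithm.

```python
import numpy as np

def amplitude_encode(x):
    """Normalize a classical vector into unit-norm state amplitudes."""
    x = np.asarray(x, dtype=float)
    return x / np.linalg.norm(x)

def quantum_kernel(x, y):
    # Squared overlap |<x|y>|^2, i.e. the swap-test estimate.
    return abs(amplitude_encode(x) @ amplitude_encode(y)) ** 2

x, y = [1.0, 2.0, 0.0, 1.0], [0.5, 1.0, 1.0, 0.0]
print(quantum_kernel(x, y))  # squared cosine similarity of the inputs
```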
1 code implementation • 22 Jun 2019 • Guangyong Chen, Pengfei Chen, Chang-Yu Hsieh, Chee-Kong Lee, Benben Liao, Renjie Liao, Weiwen Liu, Jiezhong Qiu, Qiming Sun, Jie Tang, Richard Zemel, Shengyu Zhang
We introduce a new molecular dataset, named Alchemy, for developing machine learning models useful in chemistry and material science.
no code implementations • 13 Jun 2019 • Pengfei Chen, Weiwen Liu, Chang-Yu Hsieh, Guangyong Chen, Shengyu Zhang
The IGNN model is based on an elegant and fundamental idea from information theory, as explained in the main text, and it can readily be generalized beyond the molecular graphs considered in this work.
1 code implementation • 15 May 2019 • Guangyong Chen, Pengfei Chen, Yujun Shi, Chang-Yu Hsieh, Benben Liao, Shengyu Zhang
Our work builds on the well-established idea that whitening the inputs of a neural network can speed up convergence.
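A minimal sketch of the input-whitening idea referenced here (ZCA whitening with NumPy; the data are synthetic): decorrelate features and scale them to unit variance so the optimization landscape is better conditioned.

```python
import numpy as np

def zca_whiten(X, eps=1e-5):
    """X: (n_samples, n_features). Returns ZCA-whitened data."""
    Xc = X - X.mean(axis=0)                      # center each feature
    cov = Xc.T @ Xc / (len(Xc) - 1)              # feature covariance
    eigvals, eigvecs = np.linalg.eigh(cov)       # symmetric eigendecomposition
    W = eigvecs @ np.diag(1.0 / np.sqrt(eigvals + eps)) @ eigvecs.T
    return Xc @ W

X = np.random.default_rng(0).normal(size=(100, 3)) @ np.array(
    [[2.0, 0.5, 0.0], [0.0, 1.0, 0.3], [0.0, 0.0, 0.7]])
Xw = zca_whiten(X)
print(np.cov(Xw.T).round(2))  # approximately the identity matrix
```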
no code implementations • 7 Dec 2018 • Jonathan Allcock, Chang-Yu Hsieh, Iordanis Kerenidis, Shengyu Zhang
The running times of our algorithms can be quadratically faster in the size of the network than their standard classical counterparts, since they depend linearly on the number of neurons in the network rather than on the number of connections between neurons as in the classical case; in a fully connected network, the number of connections grows quadratically with the number of neurons.