Search Results for author: Minghong Gao

Found 2 papers, 0 papers with code

A Survey on Recent Teacher-student Learning Studies

no code implementations • 10 Apr 2023 • Minghong Gao

Recent variants of knowledge distillation include teaching assistant distillation, curriculum distillation, mask distillation, and decoupling distillation, all of which aim to improve distillation performance by introducing additional components or by changing the learning process.

Knowledge Distillation
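
All of these variants build on the same base objective: the student is trained against a weighted blend of the ground-truth labels and the teacher's temperature-softened output distribution. Below is a minimal PyTorch sketch of that base loss; the temperature and weighting values are illustrative assumptions, not figures from the survey.

    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels,
                          temperature=4.0, alpha=0.5):
        # Hard-label term: ordinary cross-entropy against ground truth.
        hard = F.cross_entropy(student_logits, labels)
        # Soft-label term: KL divergence to the teacher's softened
        # distribution; temperature**2 rescales the gradients so the
        # two terms stay comparable in magnitude.
        soft = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=-1),
            F.softmax(teacher_logits / temperature, dim=-1),
            reduction="batchmean",
        ) * temperature ** 2
        # alpha balances the two terms; 0.5 is an arbitrary example value.
        return alpha * hard + (1.0 - alpha) * soft

The variants named in the abstract keep a loss of this shape but change the setup around it, e.g. teaching assistant distillation inserts an intermediate-sized model to bridge the teacher-student capacity gap.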

ChemiRise: a data-driven retrosynthesis engine

no code implementations • 9 Aug 2021 • Xiangyan Sun, Ke Liu, Yuquan Lin, Lingjie Wu, Haoming Xing, Minghong Gao, Ji Liu, Suocheng Tan, Zekun Ni, Qi Han, Junqiu Wu, Jie Fan

We have developed an end-to-end retrosynthesis system, named ChemiRise, that can propose complete retrosynthesis routes for organic compounds rapidly and reliably.

Retrosynthesis
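
Engines of this kind typically assemble complete routes by recursively applying a one-step model that proposes precursor sets, stopping when every intermediate bottoms out in a purchasable building block. The sketch below shows that route-assembly loop in schematic form; `propose_precursors` and `is_purchasable` are hypothetical placeholders standing in for the learned model and the building-block catalog, not ChemiRise's actual interface.

    def find_route(molecule, propose_precursors, is_purchasable, depth=5):
        # Base case: the molecule can be bought, so the route ends here.
        if is_purchasable(molecule):
            return {"molecule": molecule, "precursors": []}
        if depth == 0:
            return None  # give up past the depth limit
        # Try candidate one-step disconnections in ranked order.
        for precursors in propose_precursors(molecule):
            sub_routes = [
                find_route(p, propose_precursors, is_purchasable, depth - 1)
                for p in precursors
            ]
            # A disconnection works only if every precursor is routable.
            if all(r is not None for r in sub_routes):
                return {"molecule": molecule, "precursors": sub_routes}
        return None

Production systems generally replace this plain depth-first loop with a guided search (beam search or Monte Carlo tree search) over the same recursive structure.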
