1 code implementation • CVPR 2023 • Shun Lu, Yu Hu, Longxing Yang, Zihao Sun, Jilin Mei, Jianchao Tan, Chengru Song
Our method requires only negligible computational cost to optimize the sampling distributions of paths and data, yet achieves lower gradient variance during supernet training and better supernet generalization, resulting in more consistent NAS.
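The variance-reduction idea behind sampling-distribution optimization can be illustrated with a generic importance-sampling sketch (this is not the paper's actual algorithm; the path count, gradient scales, and proportional-to-magnitude heuristic below are all illustrative assumptions):

```python
import random

K = 4                             # number of candidate paths (hypothetical)
grad_norm = [4.0, 1.0, 0.5, 0.5]  # hypothetical per-path gradient magnitudes

# One classic heuristic: sample paths proportionally to gradient magnitude
# instead of uniformly, which tends to reduce estimator variance.
total = sum(grad_norm)
q = [g / total for g in grad_norm]

def sample_path():
    """Sample a path index and its importance-sampling correction weight.

    Multiplying the sampled gradient by 1/(K * q[i]) keeps the gradient
    estimate unbiased relative to uniform sampling over the K paths.
    """
    i = random.choices(range(K), weights=q)[0]
    weight = 1.0 / (K * q[i])
    return i, weight

i, w = sample_path()
```

The same reweighting pattern applies to non-uniform data sampling; the paper's contribution is learning such distributions at negligible cost during supernet training.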
no code implementations • ICCV 2023 • Zihao Sun, Yu Sun, Longxing Yang, Shun Lu, Jilin Mei, Wenxiao Zhao, Yu Hu
Neural Architecture Search (NAS) aims to automatically find optimal neural network architectures in an efficient way.
1 code implementation • International Conference on Machine Learning 2022 • Zihao Sun, Yu Hu, Shun Lu, Longxing Yang, Jilin Mei, Yinhe Han, Xiaowei Li
We utilize the attention weights to represent the importance of the relevant operations for the micro search or the importance of the relevant blocks for the macro search.
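The selection step implied here, ranking candidate operations by attention weight, can be sketched as follows (the operation names and scores are hypothetical placeholders, not the paper's search space):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Hypothetical attention scores for 4 candidate operations on one edge.
op_names = ["skip", "conv3x3", "conv5x5", "maxpool"]
scores = np.array([0.1, 2.0, 0.7, -0.3])

importance = softmax(scores)                   # attention weights as importance
chosen = op_names[int(np.argmax(importance))]  # keep the most important op
```

In the macro search the same ranking would be applied over candidate blocks rather than operations.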
1 code implementation • 27 Jan 2022 • Yang Zhao, Peng Guo, Zihao Sun, Xiuwan Chen, Han Gao
The performance of a semantic segmentation model for remote sensing (RS) images pretrained on an annotated dataset degrades greatly when tested on another, unannotated dataset because of the domain gap.
1 code implementation • BMVC 2021 • Shun Lu, Yu Hu, Longxing Yang, Zihao Sun, Jilin Mei, Yiming Zeng, Xiaowei Li
Differentiable Neural Architecture Search (DARTS) has recently attracted considerable research attention because of its high efficiency.
Ranked #9 on Neural Architecture Search on CIFAR-100