1 code implementation • 17 Apr 2024 • Qi Han, Zhibo Tian, Chengwei Xia, Kun Zhan
Semi-supervised image classification, leveraging pseudo supervision and consistency regularization, has demonstrated remarkable success.
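The combination of pseudo supervision and consistency regularization mentioned here can be illustrated with a minimal FixMatch-style sketch (an assumption for illustration, not necessarily this paper's exact formulation): a confident prediction on a weakly augmented view becomes a hard pseudo-label, which supervises the strongly augmented view.

```python
import math

def pseudo_label_loss(p_weak, p_strong, threshold=0.95):
    """Consistency loss with confidence-thresholded pseudo-labels.

    p_weak, p_strong: predicted class-probability lists for the weakly
    and strongly augmented views of one unlabeled image.
    Returns 0.0 when the weak-view prediction is not confident enough,
    so uncertain samples provide no pseudo supervision.
    """
    confidence = max(p_weak)
    if confidence < threshold:
        return 0.0  # skip uncertain samples
    label = p_weak.index(confidence)  # hard pseudo-label
    # cross-entropy of the strong view against the pseudo-label
    return -math.log(max(p_strong[label], 1e-12))
```

For example, a confident weak-view prediction `[0.98, 0.01, 0.01]` yields the pseudo-label 0, and the loss is the negative log-probability the strong view assigns to class 0.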
1 code implementation • NeurIPS 2023 • Qi Han, Yuxuan Cai, Xiangyu Zhang
This design gives our architecture a desirable property: disentangled low-level and semantic information is maintained at the end of the network during MIM pre-training.
1 code implementation • 22 Dec 2022 • Yuxuan Cai, Yizhuang Zhou, Qi Han, Jianjian Sun, Xiangwen Kong, Jun Li, Xiangyu Zhang
This architectural scheme gives RevCol very different behavior from conventional networks: during forward propagation, features in RevCol are gradually disentangled as they pass through each column, while their total information is maintained rather than compressed or discarded as in other networks.
Ranked #8 on Semantic Segmentation on ADE20K (using extra training data)
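The information-maintaining property described above rests on reversible connections. RevCol's actual design uses multi-level reversible connections across columns; the sketch below is a generic RevNet-style reversible coupling, shown only to illustrate why no information is lost: the inputs can be reconstructed exactly from the outputs.

```python
def rev_forward(x1, x2, f, g):
    """One reversible block (RevNet-style additive coupling).

    The outputs retain enough information to exactly recover the
    inputs, so nothing is compressed or discarded."""
    y1 = x1 + f(x2)
    y2 = x2 + g(y1)
    return y1, y2

def rev_inverse(y1, y2, f, g):
    """Exact inversion of rev_forward."""
    x2 = y2 - g(y1)
    x1 = y1 - f(x2)
    return x1, x2
```

Because inversion is exact, intermediate activations need not be stored during training; they can be recomputed backwards, which is the usual motivation for reversible designs.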
2 code implementations • 14 Jun 2022 • ShangHua Gao, Zhong-Yu Li, Qi Han, Ming-Ming Cheng, Liang Wang
Our search scheme combines a global search to find coarse combinations with a local search to further refine the receptive field combinations.
Ranked #2 on Instance Segmentation on COCO 2017 val (AP metric)
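The global-then-local search described above can be sketched in miniature. The real method searches receptive-field combinations per layer; this hedged sketch searches a single integer dilation rate, with `score` standing in for a hypothetical validation-metric evaluation.

```python
def global_to_local_search(score, coarse=(1, 4, 8, 16), radius=2):
    """Coarse-to-fine search over one receptive-field parameter
    (here an integer dilation rate).

    score: maps a rate to a validation metric (higher is better).
    Global stage evaluates a sparse grid; local stage refines
    around the best coarse candidate."""
    best = max(coarse, key=score)  # global stage: sparse grid
    # local stage: dense search in a small neighborhood
    neighbors = [r for r in range(best - radius, best + radius + 1) if r >= 1]
    return max(neighbors, key=score)
```

With a peaked score such as `lambda r: -(r - 6) ** 2`, the global stage lands on the coarse candidate 4 and the local stage refines it to 6, without ever evaluating the full range of rates.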
no code implementations • 9 Aug 2021 • Xiangyan Sun, Ke Liu, Yuquan Lin, Lingjie Wu, Haoming Xing, Minghong Gao, Ji Liu, Suocheng Tan, Zekun Ni, Qi Han, Junqiu Wu, Jie Fan
We have developed an end-to-end retrosynthesis system, named ChemiRise, that can propose complete retrosynthesis routes for organic compounds rapidly and reliably.
no code implementations • CVPR 2021 • Shang-Hua Gao, Qi Han, Duo Li, Ming-Ming Cheng, Pai Peng
We propose to add a simple yet effective feature calibration scheme into the centering and scaling operations of BatchNorm, enhancing instance-specific representations at negligible computational cost.
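To make the idea of calibrating BatchNorm's centering statistic concrete, here is a minimal sketch. The blending form below (a convex mix of the batch mean and an instance-specific mean, controlled by `alpha`) is an illustrative assumption, not the paper's exact scheme.

```python
import math

def calibrated_batchnorm(batch, eps=1e-5, alpha=0.9):
    """BatchNorm whose centering statistic is calibrated with an
    instance-specific mean.

    batch: list of per-sample feature lists.
    alpha: weight on the batch statistic; alpha=1.0 recovers
    standard BatchNorm centering. The blend is an assumption
    made for illustration."""
    n = len(batch)
    d = len(batch[0])
    # batch statistics, per feature dimension
    mu = [sum(x[j] for x in batch) / n for j in range(d)]
    var = [sum((x[j] - mu[j]) ** 2 for x in batch) / n for j in range(d)]
    out = []
    for x in batch:
        inst_mu = sum(x) / d  # instance-specific statistic
        out.append([
            (x[j] - (alpha * mu[j] + (1 - alpha) * inst_mu))
            / math.sqrt(var[j] + eps)
            for j in range(d)
        ])
    return out
```

Setting `alpha=1.0` reduces the centering to ordinary batch statistics, which makes the calibration term easy to ablate.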
1 code implementation • ICLR 2022 • Qi Han, Zejia Fan, Qi Dai, Lei Sun, Ming-Ming Cheng, Jiaying Liu, Jingdong Wang
Sparse connectivity: there is no connection across channels, and each position is connected to the positions within a small local window.
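Both properties named here, no cross-channel connections and a small local window per position, are exactly the connectivity pattern of a depthwise convolution. A minimal 1-D sketch:

```python
def depthwise_conv1d(x, weights):
    """1-D depthwise convolution illustrating sparse connectivity.

    x: list of channels, each a list of values.
    weights: one kernel per channel, odd length.
    Each channel is filtered only by its own kernel (no connection
    across channels), and each output position depends only on a
    small local window (zero padding keeps the length)."""
    out = []
    for channel, kernel in zip(x, weights):
        k = len(kernel) // 2
        padded = [0.0] * k + channel + [0.0] * k
        out.append([
            sum(kernel[j] * padded[i + j] for j in range(len(kernel)))
            for i in range(len(channel))
        ])
    return out
```

A local-attention window shares this connectivity structure; the difference is that attention computes the window weights dynamically per position, while the depthwise kernel above is static.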
no code implementations • 6 Mar 2021 • Ke Liu, Zekun Ni, Zhenyu Zhou, Suocheng Tan, Xun Zou, Haoming Xing, Xiangyan Sun, Qi Han, Junqiu Wu, Jie Fan
Molecular modeling is an important topic in drug discovery.
2 code implementations • CVPR 2021 • Shang-Hua Gao, Qi Han, Zhong-Yu Li, Pai Peng, Liang Wang, Ming-Ming Cheng
Our search scheme combines a global search to find coarse combinations with a local search to further refine the receptive field combination patterns.
Ranked #20 on Action Segmentation on Breakfast
2 code implementations • 25 Nov 2020 • Chang-Bin Zhang, Peng-Tao Jiang, Qibin Hou, Yunchao Wei, Qi Han, Zhen Li, Ming-Ming Cheng
Experiments demonstrate that based on the same classification models, the proposed approach can effectively improve the classification performance on CIFAR-100, ImageNet, and fine-grained datasets.
no code implementations • ECCV 2020 • Haibao Yu, Qi Han, Jianbo Li, Jianping Shi, Guangliang Cheng, Bin Fan
Learning to find an optimal mixed-precision model that preserves accuracy while satisfying specific constraints on model size and computation is extremely challenging, due to the difficulty of training a mixed-precision model and the huge space of possible bit quantizations.
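Two small sketches make the stated difficulty concrete: a uniform quantizer parameterized by bit-width, and the size of the assignment space when each layer may pick its own bit-width. Both are generic illustrations, not this paper's search method.

```python
def quantize(values, bits):
    """Uniform symmetric quantization to the given bit-width:
    snap each value onto one of 2**bits evenly spaced levels
    spanning [-m, m], where m is the largest magnitude."""
    m = max(abs(v) for v in values) or 1.0
    levels = 2 ** bits - 1          # number of steps between levels
    step = 2 * m / levels
    return [round((v + m) / step) * step - m for v in values]

def search_space_size(num_layers, bit_choices):
    """Mixed-precision assignments: one bit-width per layer, so the
    space grows exponentially in depth."""
    return len(bit_choices) ** num_layers
```

Even a modest 50-layer network with bit choices {2, 4, 8} already has 3**50 (about 7e23) possible assignments, which is why exhaustive search is infeasible.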
no code implementations • 6 May 2020 • Kai Zhao, Xin-Yu Zhang, Qi Han, Ming-Ming Cheng
Convolutional neural networks (CNNs) are typically over-parameterized, bringing considerable computational overhead and memory footprint in inference.
2 code implementations • ECCV 2020 • Kai Zhao, Qi Han, Chang-Bin Zhang, Jun Xu, Ming-Ming Cheng
In addition to the proposed method, we design an evaluation metric to assess the quality of line detection and construct a large scale dataset for the line detection task.
Ranked #2 on Line Detection on NKL
no code implementations • 7 Aug 2019 • Bin Guo, Huihui Chen, Yan Liu, Chao Chen, Qi Han, Zhiwen Yu
A generic model for CrowdMining is further proposed based on a set of existing studies.
no code implementations • 5 Aug 2019 • Haibao Yu, Tuopu Wen, Guangliang Cheng, Jiankai Sun, Qi Han, Jianping Shi
It is challenging for low-bit quantization to maintain high performance with limited model capacity (e.g., 4-bit for both weights and activations).
no code implementations • LREC 2016 • Maximilian Köper, Melanie Zaiß, Qi Han, Steffen Koch, Sabine Schulte im Walde
Vector space models and distributional information are widely used in NLP.