Search Results for author: Shaoyun Xu

Found 4 papers, 0 papers with code

Attention Round for Post-Training Quantization

no code implementations · 7 Jul 2022 · Huabin Diao, Gongyan Li, Shaoyun Xu, Yuexing Hao

For ResNet18 and MobileNetV2, the post-training quantization method proposed in this paper requires only 1,024 training samples and 10 minutes to complete the quantization process, yet achieves performance on par with quantization-aware training.

Combinatorial Optimization · Quantization
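The Attention Round scheme itself is not public, but the general shape of post-training quantization it improves on can be sketched: calibrate a per-tensor scale from the weights, round to a low-bit integer grid, and measure the reconstruction error. This is a generic symmetric uniform-quantization baseline, not the paper's method; all function names here are illustrative.

```python
import numpy as np

def quantize_tensor(w, num_bits=8):
    # Symmetric uniform quantization: map floats onto a signed integer grid.
    qmax = 2 ** (num_bits - 1) - 1
    scale = np.abs(w).max() / qmax
    q = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize_tensor(q, scale):
    # Map integers back to floats; the gap to the original is the quantization error.
    return q.astype(np.float32) * scale

# Example: quantize a random weight matrix and bound the per-element error.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4)).astype(np.float32)
q, s = quantize_tensor(w)
w_hat = dequantize_tensor(q, s)
err = float(np.abs(w - w_hat).max())  # at most half a quantization step
```

Rounding to the nearest grid point bounds the per-element error by half the scale, which is why a small calibration set (such as the 1,024 samples cited above) can suffice to choose good scales without full retraining.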

Memory-Based Label-Text Tuning for Few-Shot Class-Incremental Learning

no code implementations · 3 Jul 2022 · Jinze Li, Yan Bai, Yihang Lou, Xiongkun Linghu, Jianzhong He, Shaoyun Xu, Tao Bai

The difficulty is that training on a sequence of limited data from new tasks leads to severe overfitting and causes the well-known catastrophic forgetting problem.

Few-Shot Class-Incremental Learning · Incremental Learning
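The paper's memory-based label-text tuning is not described in this snippet, but the catastrophic-forgetting problem it targets is commonly mitigated with a rehearsal memory: keep a few exemplars per old class and replay them alongside new-task data. The sketch below is a generic rehearsal buffer under that assumption, not the authors' method; the class and method names are hypothetical.

```python
import random

class ExemplarMemory:
    """Tiny rehearsal buffer: retain a few samples per previously seen class
    and replay them when training on a new task, so old classes stay represented."""

    def __init__(self, per_class=5):
        self.per_class = per_class
        self.store = {}  # label -> list of retained samples

    def add(self, samples, label):
        # Keep at most `per_class` exemplars for each class.
        kept = self.store.setdefault(label, [])
        for s in samples:
            if len(kept) < self.per_class:
                kept.append(s)

    def replay_batch(self, k):
        # Draw up to k (sample, label) pairs from all stored classes.
        pool = [(s, y) for y, ss in self.store.items() for s in ss]
        return random.sample(pool, min(k, len(pool)))

# Example: 10 samples arrive for class 0, but only 5 are retained.
mem = ExemplarMemory(per_class=5)
mem.add(list(range(10)), label=0)
batch = mem.replay_batch(3)
```

Mixing such a replay batch into each new-task update is one of the standard ways to keep limited new-task data from overwriting earlier knowledge.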

Fastidious Attention Network for Navel Orange Segmentation

no code implementations · 26 Mar 2020 · Xiaoye Sun, Gongyan Li, Shaoyun Xu

Deep learning achieves excellent performance in many domains, so we apply it to the navel orange semantic segmentation task to solve two problems, distinguishing defect categories and identifying the stem end and blossom end, and we also propose a fastidious attention mechanism to further improve model performance.

Semantic Segmentation
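The "fastidious attention" mechanism itself is not detailed in this snippet; as a point of reference, attention modules for segmentation backbones often follow a squeeze-and-excite pattern: pool a feature map over space, pass the result through a small gating network, and reweight channels. The sketch below shows that generic pattern in NumPy and is an assumption-labeled illustration, not the paper's mechanism.

```python
import numpy as np

def channel_attention(feat, w1, w2):
    # feat: (C, H, W) feature map; w1, w2: bottleneck MLP weights.
    # Squeeze: global average pool over spatial dimensions -> (C,)
    z = feat.mean(axis=(1, 2))
    # Excite: small bottleneck MLP with a sigmoid gate in (0, 1).
    h = np.maximum(w1 @ z, 0.0)              # ReLU
    gate = 1.0 / (1.0 + np.exp(-(w2 @ h)))   # sigmoid, one gate per channel
    # Reweight each channel of the feature map by its gate.
    return feat * gate[:, None, None]

# Example with random weights and a 3-channel feature map.
rng = np.random.default_rng(0)
feat = rng.normal(size=(3, 4, 4))
w1 = rng.normal(size=(2, 3))   # squeeze 3 channels down to 2
w2 = rng.normal(size=(3, 2))   # expand back to 3 gates
out = channel_attention(feat, w1, w2)
```

Because the gate lies strictly in (0, 1), the module can only attenuate channels, letting the network learn which channels matter for distinguishing, e.g., stem ends from blossom ends.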

FSD: Feature Skyscraper Detector for Stem End and Blossom End of Navel Orange

no code implementations · 24 May 2019 · Xiaoye Sun, Gongyan Li, Shaoyun Xu

To accurately and efficiently distinguish the stem end and the blossom end of a navel orange from its black spots, we propose a feature skyscraper detector (FSD) with low computational cost, a compact architecture, and high detection accuracy.

Clustering
