no code implementations • 17 Apr 2024 • Dingkun Zhang, Sijia Li, Chen Chen, Qingsong Xie, Haonan Lu
To this end, we propose layer pruning and normalized distillation for compressing diffusion models (LAPTOP-Diff).
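The "normalized distillation" idea can be sketched as follows: a per-layer feature-matching loss where each layer's error is rescaled by the teacher's feature magnitude, so no single layer dominates. This is an illustrative assumption of the general technique, not the exact LAPTOP-Diff objective; the function name and normalization choice are hypothetical.

```python
import numpy as np

def normalized_distillation_loss(student_feats, teacher_feats, eps=1e-8):
    # Hypothetical sketch of a normalized distillation objective:
    # each layer's MSE is divided by the teacher feature's mean energy,
    # balancing layers with very different activation scales.
    # Not the exact LAPTOP-Diff formulation.
    losses = []
    for s, t in zip(student_feats, teacher_feats):
        mse = np.mean((s - t) ** 2)
        scale = np.mean(t ** 2) + eps  # teacher feature energy
        losses.append(mse / scale)
    return float(np.mean(losses))
```

With identical student and teacher features the loss is exactly zero, and it grows as the (scale-normalized) feature gap widens.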
no code implementations • 3 Mar 2024 • Hongjian Liu, Qingsong Xie, Zhijie Deng, Chen Chen, Shixiang Tang, Fueyang Fu, Zheng-Jun Zha, Haonan Lu
In contrast to vanilla consistency distillation (CD), which distills the ordinary differential equation (ODE) solver-based sampling process of a pretrained teacher model into a student, SCott explores the possibility and validates the efficacy of integrating stochastic differential equation (SDE) solvers into CD to fully unleash the potential of the teacher.
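The ODE-versus-SDE distinction above can be sketched with one-step solvers: a deterministic Euler step (the vanilla CD teacher) versus an Euler-Maruyama step that adds Brownian noise (the SDE-solver variant SCott explores). The drift and diffusion functions here are placeholders, not the actual diffusion-model parameterization.

```python
import numpy as np

def ode_euler_step(x, t, dt, drift):
    # Deterministic (ODE-solver) teacher step, as used in vanilla
    # consistency distillation.
    return x + drift(x, t) * dt

def sde_euler_maruyama_step(x, t, dt, drift, diffusion, rng):
    # Stochastic (SDE-solver) teacher step: the same drift plus a
    # Brownian-noise term scaled by sqrt(dt). Illustrative sketch only.
    noise = rng.normal(size=np.shape(x)) * np.sqrt(dt)
    return x + drift(x, t) * dt + diffusion(t) * noise
```

Setting the diffusion coefficient to zero recovers the deterministic ODE step, which is one way to see the SDE solver as a strict generalization of the ODE solver inside CD.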
1 code implementation • 28 Nov 2023 • Jian Ma, Chen Chen, Qingsong Xie, Haonan Lu
In this paper, we are inspired to propose a simple plug-and-play language transfer method based on knowledge distillation.
Cross-lingual Text-to-Image Generation · Knowledge Distillation · +1
1 code implementation • 13 Jun 2023 • Weizhen He, Yiheng Deng, Shixiang Tang, Qihao Chen, Qingsong Xie, Yizhou Wang, Lei Bai, Feng Zhu, Rui Zhao, Wanli Ouyang, Donglian Qi, Yunfeng Yan
This paper strives to resolve this problem by proposing a new instruct-ReID task that requires the model to retrieve images according to the given image or language instructions.
1 code implementation • CVPR 2023 • Shixiang Tang, Cheng Chen, Qingsong Xie, Meilin Chen, Yizhou Wang, Yuanzheng Ci, Lei Bai, Feng Zhu, Haiyang Yang, Li Yi, Rui Zhao, Wanli Ouyang
Specifically, we propose HumanBench, built on existing datasets, to comprehensively evaluate, on common ground, the generalization abilities of different pretraining methods across 19 datasets from 6 diverse downstream tasks: person ReID, pose estimation, human parsing, pedestrian attribute recognition, pedestrian detection, and crowd counting.
Ranked #1 on Pedestrian Attribute Recognition on PA-100K (using extra training data)