no code implementations • ECCV 2020 • Haifeng Xia, Zhengming Ding
Domain adaptation, as an important tool, aims to learn a generalized model from well-annotated source knowledge to address learning on a target domain with insufficient or even no annotations.
no code implementations • 30 Apr 2024 • Zhendong Liu, Haifeng Xia, Tong Guo, Libo Sun, Ming Shao, Siyu Xia
In addition, a dedicated temporal convolution is applied at each level to learn short-term temporal features, which are carried over from shallow to deep layers to make the most of low-level details.
no code implementations • 29 Apr 2024 • Xiangyu Liang, Wenlin Zhuang, Tianyong Wang, Guangxing Geng, Guangyue Geng, Haifeng Xia, Siyu Xia
Speech-driven 3D facial animation technology has been developed for years, but its practical application still falls short of expectations.
no code implementations • 29 Apr 2024 • Tianyong Wang, Xiangyu Liang, Wangguandong Zheng, Dan Niu, Haifeng Xia, Siyu Xia
The talking head generation recently attracted considerable attention due to its widespread application prospects, especially for digital avatars and 3D animation design.
no code implementations • 14 Apr 2024 • Haifeng Xia, Hai Huang, Zhengming Ding
Deep clustering, an important branch of unsupervised representation learning, focuses on embedding semantically similar samples into the same feature space.
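As an illustrative sketch (not the paper's method), a deep-clustering pipeline typically clusters the embeddings produced by an encoder network; the clustering half can be as simple as k-means over those embeddings. Here `Z` stands in for learned embeddings, and the farthest-point initialization is an assumption chosen for determinism:

```python
import numpy as np

def kmeans_embed_cluster(Z, k, iters=20):
    # Z: learned embeddings, shape (n, d). In a real deep-clustering
    # pipeline Z would come from an encoder network; here it is given.
    # Deterministic farthest-point initialization: start from Z[0],
    # then repeatedly pick the point farthest from all chosen centers.
    centers = [Z[0]]
    for _ in range(k - 1):
        d = np.min([((Z - c) ** 2).sum(-1) for c in centers], axis=0)
        centers.append(Z[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        # Assign each embedding to its nearest center.
        d = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(axis=1)
        # Recompute each center as the mean of its assigned embeddings.
        for j in range(k):
            if (labels == j).any():
                centers[j] = Z[labels == j].mean(axis=0)
    return labels

# Toy usage: two well-separated "embedding" clusters.
Z = np.vstack([np.zeros((5, 2)), 10.0 * np.ones((5, 2))])
labels = kmeans_embed_cluster(Z, k=2)
```

Semantically similar samples (here, points in the same blob) end up with the same cluster label, which is the property deep-clustering objectives optimize for in the learned space.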
no code implementations • 2 Apr 2024 • Wangguandong Zheng, Haifeng Xia, Rui Chen, Ming Shao, Siyu Xia, Zhengming Ding
Recently, image-to-3D approaches have achieved significant results with a natural image as input.
no code implementations • ICCV 2023 • Haifeng Xia, Kai Li, Zhengming Ding
Federated learning enables distributed local clients to collaborate, with privacy protected, to attain a more generic global model.
no code implementations • ICCV 2023 • Haifeng Xia, Kai Li, Martin Renqiang Min, Zhengming Ding
This operation maximizes the contribution of discriminative frames to further capture the similarity of support and query samples from the same category.
no code implementations • 20 Sep 2022 • Haifeng Xia, Pu Wang, Toshiaki Koike-Akino, Ye Wang, Philip Orlik, Zhengming Ding
Domain adaptation (DA) aims to transfer the knowledge of a well-labeled source domain to facilitate unlabeled target learning.
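One common family of DA techniques aligns feature statistics across domains. As a hedged sketch of that idea (a CORAL-style second-order alignment, not the specific method of any paper listed here), source features can be whitened and then re-colored with the target covariance:

```python
import numpy as np

def coral_align(Xs, Xt, eps=1e-5):
    # Second-order feature alignment: whiten source features with the
    # source covariance, re-color with the target covariance, and shift
    # to the target mean. eps adds a small ridge for invertibility.
    Cs = np.cov(Xs, rowvar=False) + eps * np.eye(Xs.shape[1])
    Ct = np.cov(Xt, rowvar=False) + eps * np.eye(Xt.shape[1])

    def mat_pow(C, p):
        # Symmetric matrix power via eigendecomposition.
        w, V = np.linalg.eigh(C)
        return V @ np.diag(w ** p) @ V.T

    A = mat_pow(Cs, -0.5) @ mat_pow(Ct, 0.5)
    return (Xs - Xs.mean(0)) @ A + Xt.mean(0)

# Toy usage: source is isotropic, target is scaled and shifted.
rng = np.random.default_rng(0)
Xs = rng.normal(size=(200, 3))
Xt = rng.normal(size=(200, 3)) @ np.diag([2.0, 1.0, 0.5]) + 1.0
Xa = coral_align(Xs, Xt)
```

After alignment, the transformed source features share the target's mean and (approximately) its covariance, so a classifier trained on them transfers more readily to unlabeled target data.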
no code implementations • 29 Sep 2021 • Haifeng Xia, Taotao Jing, Zizhan Zheng, Zhengming Ding
Unsupervised domain adaptation (UDA) aims to transfer knowledge from one or more well-labeled source domains to improve model performance on the different-yet-related target domain without any annotations.
no code implementations • ICCV 2021 • Haifeng Xia, Handong Zhao, Zhengming Ding
Unsupervised domain adaptation addresses knowledge transfer given the coexistence of a well-annotated source domain and unlabeled target instances.
no code implementations • 1 Jan 2021 • Haifeng Xia, Taotao Jing, Zhengming Ding
Batch Normalization (BN) is an important component that helps deep neural networks achieve promising performance on a wide range of learning tasks by normalizing the distribution of feature representations within mini-batches.
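For reference, the per-mini-batch operation BN performs can be sketched as follows (training-mode statistics only; running averages for inference are omitted for brevity):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # x: mini-batch of features, shape (batch, features).
    # Normalize each feature to zero mean / unit variance across the
    # batch, then rescale and shift with learnable gamma and beta.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Toy usage on a 3-sample mini-batch with 2 features.
x = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
y = batch_norm(x, gamma=np.ones(2), beta=np.zeros(2))
```

With `gamma=1` and `beta=0`, each output feature has (near-)zero mean and unit variance across the batch, which is the distribution-stabilizing effect the snippet above refers to.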
no code implementations • 27 Aug 2020 • Taotao Jing, Haifeng Xia, Zhengming Ding
Partial domain adaptation (PDA) attracts considerable attention as it deals with a realistic and challenging problem in which the source domain label space subsumes that of the target domain.
no code implementations • CVPR 2020 • Haifeng Xia, Zhengming Ding
Unsupervised domain adaptation (UDA) addresses insufficient or absent labels in the target domain by exploring well-annotated source knowledge drawn from a different distribution.
no code implementations • 12 Feb 2020 • Guanglei Yang, Haifeng Xia, Mingli Ding, Zhengming Ding
To balance the mitigation of domain gap and the preservation of the inherent structure, we propose a Bi-Directional Generation domain adaptation model with consistent classifiers interpolating two intermediate domains to bridge source and target domains.
no code implementations • NeurIPS 2016 • Hong Chen, Haifeng Xia, Heng Huang, Weidong Cai
The Nyström method has been used successfully to improve the computational efficiency of kernel ridge regression (KRR).
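As a minimal sketch of the idea (a generic Nyström-KRR formulation, not necessarily the estimator analyzed in this paper): sample m ≪ n landmark points, restrict the predictor to the span of their kernel columns, and solve an m-dimensional system instead of the full n-dimensional one:

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of A and rows of B.
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def nystrom_krr_fit(X, y, m, lam=1e-4, gamma=1.0, seed=0):
    # Sample m landmarks Z, form C = K(X, Z) and W = K(Z, Z), and
    # solve the reduced system (C^T C + lam*n*W) beta = C^T y.
    # This replaces the O(n^3) full-KRR solve with an O(m^3) one.
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    Z = X[idx]
    C = rbf(X, Z, gamma)          # (n, m) cross-kernel
    W = rbf(Z, Z, gamma)          # (m, m) landmark kernel
    n = len(X)
    beta = np.linalg.solve(C.T @ C + lam * n * W, C.T @ y)
    return Z, beta

def nystrom_krr_predict(Xte, Z, beta, gamma=1.0):
    # Prediction uses only the m landmark kernel columns.
    return rbf(Xte, Z, gamma) @ beta

# Toy usage: fit a smooth 1-D function with 15 landmarks out of 40 points.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(40, 1))
y = np.sin(3 * X[:, 0])
Z, beta = nystrom_krr_fit(X, y, m=15, lam=1e-4, gamma=5.0)
pred = nystrom_krr_predict(X, Z, beta, gamma=5.0)
```

The computational saving is the point: fitting touches only an (n, m) cross-kernel and an (m, m) landmark kernel rather than the full (n, n) kernel matrix.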