no code implementations • 29 Dec 2023 • Jie Shen, Shusen Yang, Cong Zhao, Xuebin Ren, Peng Zhao, Yuqian Yang, Qing Han, Shuaijun Wu
Intelligent equipment fault diagnosis based on Federated Transfer Learning (FTL) attracts considerable attention from both academia and industry.
1 code implementation • 14 Dec 2023 • Guiqin Wang, Peng Zhao, Yanjiang Shi, Cong Zhao, Shusen Yang
Addressing this gap, our paper introduces a knowledge distillation framework that uses a generative model to train a lightweight student model.
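The generative component of the framework is specific to the paper, but the distillation objective it builds on is standard: the student matches the teacher's temperature-softened output distribution. A minimal sketch of that loss (plain KD, not the paper's full method) is:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened outputs,
    # scaled by T^2 as is conventional in knowledge distillation.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (T ** 2) * kl.mean()
```

The loss is zero when the student reproduces the teacher's logits exactly and positive otherwise; in practice it is combined with the usual cross-entropy on ground-truth labels.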
no code implementations • 20 Oct 2023 • Tobias Deußer, Cong Zhao, Wolfgang Krämer, David Leonhard, Christian Bauckhage, Rafet Sifa
During the pre-training step of natural language models, the main objective is to learn a general representation of the pre-training dataset, usually requiring large amounts of textual data to capture the complexity and diversity of natural language.
no code implementations • ICCV 2023 • Guiqin Wang, Peng Zhao, Cong Zhao, Shusen Yang, Jie Cheng, Luziwei Leng, Jianxing Liao, Qinghai Guo
To address this problem, we propose a novel attention-based hierarchically-structured latent model to learn the temporal variations of feature semantics.
1 code implementation • 24 Mar 2022 • Luhui Wang, Cong Zhao, Shusen Yang, Xinyu Yang, Julie McCann
Intelligent applications based on machine learning are impacting many parts of our lives.
no code implementations • 24 Jul 2020 • Jing Chen, Chenhui Wang, Kejun Wang, Chaoqun Yin, Cong Zhao, Tao Xu, Xinyi Zhang, Ziqiang Huang, Meichen Liu, Tao Yang
Existing multimodal emotion databases recorded under real-world conditions are few and small, with a limited number of subjects and expressions in a single language.
no code implementations • 4 May 2020 • Yuanrui Dong, Peng Zhao, Hanqiao Yu, Cong Zhao, Shusen Yang
The emerging edge-cloud collaborative Deep Learning (DL) paradigm aims at improving the performance of practical DL implementations in terms of cloud bandwidth consumption, response latency, and data privacy preservation.
no code implementations • 22 Apr 2020 • Qing Han, Shusen Yang, Xuebin Ren, Cong Zhao, Jingqi Zhang, Xinyu Yang
However, heterogeneous and limited computation and communication resources on edge servers (or edges) pose great challenges to distributed ML and motivate a new paradigm of Edge Learning (i.e., edge-cloud collaborative machine learning).
no code implementations • 17 Dec 2019 • Yanan Li, Shusen Yang, Xuebin Ren, Cong Zhao
Formally, we give the first analysis on the model convergence of AFL under DP and propose a multi-stage adjustable private algorithm (MAPA) to improve the trade-off between model utility and privacy by dynamically adjusting both the noise scale and the learning rate.
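MAPA's key idea is adjusting both the noise scale and the learning rate across stages of training. The sketch below is a toy illustration of that schedule on a DP-SGD-style update (clip the gradient to bound sensitivity, add Gaussian noise), not the paper's exact algorithm; the stage values are made up for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_update(w, grad, lr, sigma, clip=1.0):
    # Clip the gradient to bound its sensitivity, then add Gaussian
    # noise calibrated to the clipping bound.
    norm = np.linalg.norm(grad)
    g = grad * min(1.0, clip / (norm + 1e-12))
    noisy = g + rng.normal(0.0, sigma * clip, size=g.shape)
    return w - lr * noisy

# Multi-stage schedule: shrink both the learning rate and the noise
# scale as training proceeds (illustrative values, not from the paper).
stages = [(0.1, 1.0), (0.05, 0.5), (0.01, 0.25)]  # (lr, sigma) per stage
w = np.zeros(4)
for lr, sigma in stages:
    for _ in range(10):
        grad = 2 * (w - np.ones(4))  # gradient of ||w - 1||^2
        w = dp_update(w, grad, lr, sigma)
```

Later stages inject less noise and take smaller steps, which is what lets the trade-off between utility and privacy improve over a fixed-noise baseline.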
no code implementations • 7 May 2019 • Yang Jiang, Cong Zhao, Zeyang Dou, Lei Pang
Based on this correlation, we further demonstrate that, although the NAS reward is sparse, the policy gradient method implicitly assigns the reward to all operations and skip connections according to their sampling frequency.
no code implementations • 22 Aug 2018 • Yadan Luo, Ziwei Wang, Zi Huang, Yang Yang, Cong Zhao
Rich high-quality annotated data is critical for semantic segmentation learning, yet acquiring dense and pixel-wise ground-truth is both labor- and time-consuming.
2 code implementations • NeurIPS 2017 • Xiaofan Lin, Cong Zhao, Wei Pan
We introduce a novel scheme to train binary convolutional neural networks (CNNs) -- CNNs with weights and activations constrained to {-1,+1} at run-time.
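The paper's scheme approximates full-precision weights with linear combinations of binary bases; the building block underneath is plain sign binarization, sketched here as a simpler baseline (not the paper's exact approximation):

```python
import numpy as np

def binarize(w):
    # Deterministic sign binarization: every weight becomes -1 or +1.
    b = np.sign(w)
    b[b == 0] = 1  # map exact zeros to +1 so values stay in {-1, +1}
    return b

W = np.array([[0.3, -1.2, 0.0],
              [2.1, -0.4, 0.7]])
Wb = binarize(W)
```

With weights and activations in {-1, +1}, the dot products inside a convolution reduce to XNOR and bit-count operations at run-time, which is the source of the speed and memory savings.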