Unsupervised Domain Adaptation for Device-free Gesture Recognition

Device-free human gesture recognition with Radio Frequency (RF) signals has attracted wide attention due to the pervasiveness, privacy-preserving nature, and broad coverage of RF signals. However, neural network models trained for recognition with data collected from a specific domain suffer from significant performance degradation when applied to a new domain. To tackle this challenge, we propose an unsupervised domain adaptation framework for device-free gesture recognition that makes effective use of unlabeled target domain data. Specifically, we apply pseudo labeling and consistency regularization with carefully elaborated designs on target domain data to produce pseudo labels and align instance features of the target domain. We then design two data augmentation methods that randomly erase parts of the input data to enhance the robustness of the model. Furthermore, we apply a confidence control constraint to tackle the overconfidence problem. We conduct extensive experiments on a public WiFi dataset and a public millimeter wave radar dataset. The experimental results demonstrate the effectiveness of the proposed framework.
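To make the combination of ingredients concrete, below is a minimal PyTorch sketch of a pseudo-labeling plus consistency-regularization objective on unlabeled target-domain data, with a random-erasing augmentation and a simple confidence-control term. Function names, thresholds, and loss weights (random_erase, threshold, lam_conf) are illustrative assumptions, not the paper's actual implementation or hyperparameters.

```python
import torch
import torch.nn.functional as F


def random_erase(x, max_frac=0.3):
    """Randomly zero out a contiguous span along the time axis.

    Illustrative augmentation only: the paper erases parts of the input
    RF data; the exact erasing scheme here is an assumption.
    x: (batch, channels, time) tensor.
    """
    b, c, t = x.shape
    out = x.clone()
    for i in range(b):
        length = int(t * torch.rand(1).item() * max_frac)
        if length == 0:
            continue
        start = torch.randint(0, t - length + 1, (1,)).item()
        out[i, :, start:start + length] = 0.0
    return out


def target_domain_loss(model, x_target, threshold=0.95, lam_conf=0.1):
    """Pseudo-labeling + consistency loss on unlabeled target-domain data,
    plus an entropy-style confidence-control penalty (hypothetical form).
    """
    # Weakly augmented view produces pseudo labels (no gradient).
    with torch.no_grad():
        weak_logits = model(random_erase(x_target, max_frac=0.1))
        probs = F.softmax(weak_logits, dim=1)
        conf, pseudo = probs.max(dim=1)
        mask = (conf >= threshold).float()  # keep only confident pseudo labels

    # A strongly augmented view must stay consistent with the pseudo labels.
    strong_logits = model(random_erase(x_target, max_frac=0.4))
    per_sample = F.cross_entropy(strong_logits, pseudo, reduction="none")
    consistency = (per_sample * mask).mean()

    # Confidence control: discourage overly peaked predictions by maximizing
    # the entropy of the batch-mean prediction (an illustrative choice).
    mean_probs = F.softmax(strong_logits, dim=1).mean(dim=0)
    confidence_penalty = (mean_probs * torch.log(mean_probs + 1e-8)).sum()

    return consistency + lam_conf * confidence_penalty
```

In this sketch the weakly augmented pass supplies hard pseudo labels only when the prediction is confident enough, and the strongly augmented pass is trained to agree with them; the final term counteracts overconfidence, standing in for the paper's confidence control constraint.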
