Privacy-Enhancing Face Obfuscation Guided by Semantic-Aware Attribution Maps

Face recognition technology is increasingly integrated into our daily lives, e.g., Face ID. With the advancement of machine learning algorithms, personal information such as age, gender, and race can easily be deduced from the face images recorded by these applications. This poses a serious privacy threat to individuals who do not want to be profiled, since the face images are collected only for biometric purposes. Existing methods mostly focus on adding invisible adversarial perturbations to the images to make automatic inference infeasible. However, the application scenarios of these methods are limited because the perturbations depend on a specific model. In this paper, we introduce a novel face privacy-enhancing framework that obfuscates the stored faces, maintaining data utility (face identity) while protecting user privacy (facial attributes). Specifically, we first develop a feature attribution module to discover identity-related facial parts. Within this module, we introduce a pixel importance estimation model based on the Shapley value to obtain a pixel-level attribution map; the pixel attributions are then aggregated into semantic facial parts to quantify the importance of each part. Next, we design a privacy-enhancing module that generates high-quality obfuscated images, modifying the privacy-sensitive semantic content while preserving identity-related information. With the proposed method, users can choose one or more attributes to be obfuscated without affecting identity matching. Extensive experiments on the CelebA-HQ and VGGFace2-HQ benchmarks demonstrate the effectiveness and generalization ability of our method.
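
To make the attribution step concrete, the following minimal sketch (not the authors' implementation) estimates per-pixel Shapley values for an identity-matching score via Monte Carlo permutation sampling and then averages them over semantic facial parts. The `identity_score` function, the part-label map, and the baseline image are hypothetical placeholders; a practical implementation would operate on patches or parsing regions rather than individual pixels to keep the number of model evaluations manageable.

```python
import numpy as np

def shapley_pixel_attribution(img, identity_score, baseline, num_samples=200, rng=None):
    """Monte Carlo estimate of per-pixel Shapley values for an identity score.

    For each sampled random pixel ordering, a pixel's marginal contribution is
    the change in identity_score when it is revealed on top of the pixels that
    precede it in the ordering; averaging over orderings approximates each
    pixel's Shapley value. Cost is O(num_samples * H * W) score evaluations,
    so real systems would work at patch or region granularity.
    """
    rng = rng or np.random.default_rng(0)
    h, w, _ = img.shape
    n = h * w
    flat_img = img.reshape(n, 3)
    phi = np.zeros(n)

    for _ in range(num_samples):
        order = rng.permutation(n)
        current = baseline.copy().reshape(n, 3)
        prev_score = identity_score(current.reshape(h, w, 3))
        for idx in order:
            current[idx] = flat_img[idx]
            score = identity_score(current.reshape(h, w, 3))
            phi[idx] += score - prev_score
            prev_score = score

    return (phi / num_samples).reshape(h, w)


def aggregate_to_parts(pixel_attr, part_labels):
    """Average pixel-level attributions within each semantic facial part."""
    return {int(p): float(pixel_attr[part_labels == p].mean())
            for p in np.unique(part_labels)}


if __name__ == "__main__":
    # Toy usage with a dummy identity score (cosine similarity to a reference image)
    # and a fake two-part parsing map; both stand in for a face recognizer and
    # a face-parsing network, which the sketch does not include.
    rng = np.random.default_rng(0)
    ref = rng.random((8, 8, 3))
    img = ref + 0.1 * rng.random((8, 8, 3))
    baseline = np.zeros_like(img)
    score = lambda x: float(x.ravel() @ ref.ravel()
                            / (np.linalg.norm(x) * np.linalg.norm(ref) + 1e-8))
    attr = shapley_pixel_attribution(img, score, baseline, num_samples=20, rng=rng)
    part_labels = np.zeros((8, 8), dtype=int)
    part_labels[:, 4:] = 1
    print(aggregate_to_parts(attr, part_labels))
```

The part-level scores produced this way are what a privacy-enhancing generator could condition on: parts with high identity attribution are preserved, while low-attribution, attribute-revealing regions are candidates for obfuscation.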
