A Perceptual Distortion Reduction Framework: Towards Generating Adversarial Examples with High Perceptual Quality and Attack Success Rate

1 May 2021 · Ruijie Yang, Yunhong Wang, Ruikui Wang, Yuanfang Guo

Most adversarial attack methods suffer from large perceptual distortions, such as visible artifacts, when the attack strength is relatively high. A certain portion of these distortions contributes little to the attack success rate. This portion, which is induced by unnecessary modifications and the lack of a proper perceptual distortion constraint, is the target of the proposed framework. In this paper, we propose a perceptual distortion reduction framework that tackles the problem from two perspectives. First, we introduce a perceptual distortion constraint into the objective function, so that the perceptual distortion and the attack success rate are jointly optimized. Second, we propose an adaptive penalty factor $\lambda$ to balance the discrepancies between different samples. Since SGD and Momentum-SGD fail to optimize our complex non-convex problem, we exploit Adam for the optimization. Extensive experiments verify the superiority of our proposed framework.
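The joint objective in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a toy linear classifier with a hinge-style attack loss and an $\ell_2$ perturbation norm as the perceptual distortion term (the paper attacks deep networks and uses its own perceptual constraint), and the penalty factor `lam` is held fixed here, whereas the paper adapts it per sample. Adam then minimizes the attack loss plus `lam` times the distortion.

```python
import math

def adam_attack(x, w, label, lam=0.1, lr=0.05, steps=200,
                beta1=0.9, beta2=0.999, eps=1e-8):
    """Craft a perturbation `delta` with Adam, jointly minimizing a toy
    attack loss and a perceptual distortion term lam * ||delta||^2.
    All of this is an illustrative assumption, not the paper's model."""
    n = len(x)
    delta = [0.0] * n   # adversarial perturbation
    m = [0.0] * n       # Adam first moment
    v = [0.0] * n       # Adam second moment
    for t in range(1, steps + 1):
        # Toy linear score: positive means the true class is still predicted.
        score = label * sum(wi * (xi + di)
                            for wi, xi, di in zip(w, x, delta))
        for i in range(n):
            # Hinge-style attack loss gradient: push the score below zero.
            grad_attack = label * w[i] if score > 0 else 0.0
            # Distortion term lam * ||delta||^2 has gradient 2 * lam * delta.
            g = grad_attack + 2.0 * lam * delta[i]
            # Standard Adam update with bias correction.
            m[i] = beta1 * m[i] + (1 - beta1) * g
            v[i] = beta2 * v[i] + (1 - beta2) * g * g
            m_hat = m[i] / (1 - beta1 ** t)
            v_hat = v[i] / (1 - beta2 ** t)
            delta[i] -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return delta
```

On this toy problem the perturbation settles near the decision boundary: the attack gradient drives the score down while the distortion penalty keeps `delta` small, which is the trade-off the framework's objective encodes.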

