On the uncertainty principle of neural networks

Despite their success in many fields, neural networks are found to be difficult to make both accurate and robust; that is, highly accurate networks are often vulnerable. Various empirical and analytic studies have substantiated that there is, to some degree, a trade-off between the accuracy and robustness of neural networks. If this property is inherent, applications built on neural networks are vulnerable and their predictions untrustworthy. To explore and understand this issue more deeply, in this study we show that the accuracy-robustness trade-off is an intrinsic property whose underlying mechanism is closely related to the uncertainty principle in quantum mechanics. By relating the loss function of a neural network to the wave function in quantum mechanics, we show that the inputs and their conjugates cannot be resolved by a neural network simultaneously. This work thus provides an insightful explanation, from an entirely new perspective, for why the accuracy-robustness dilemma is inevitable in general deep networks, and it further reveals the possibility of studying various properties of neural networks with the mature mathematical tools of quantum physics.
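As a rough illustration of the analogy the abstract alludes to, the LaTeX sketch below juxtaposes the standard Heisenberg uncertainty relation with an analogous relation in which the input $x$ and the loss gradient $\partial\mathcal{L}/\partial x$ are assumed to play the roles of the conjugate pair. The notation $\mathcal{L}$ and the constant $C$ are illustrative assumptions, not the paper's exact definitions or derivation.

% Minimal sketch of the analogy, assuming the loss plays the role of the wave function;
% this is not the paper's derivation.
% In quantum mechanics, position x and momentum p are conjugate because the
% momentum-space wave function is the Fourier transform of \psi(x):
\[
  \tilde{\psi}(p) = \frac{1}{\sqrt{2\pi\hbar}} \int \psi(x)\, e^{-ipx/\hbar}\, \mathrm{d}x ,
\]
% which implies the Heisenberg uncertainty relation
\[
  \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}.
\]
% Assumed analogue: if the loss \mathcal{L}(x) of a network is treated like a wave
% function, the conjugate of the input x is its Fourier-dual variable, associated
% with the input gradient \partial\mathcal{L}/\partial x that drives gradient-based
% adversarial perturbations. A trade-off of the same form would then read
\[
  \Delta x \cdot \Delta\!\left(\frac{\partial \mathcal{L}}{\partial x}\right) \;\ge\; C,
\]
% with C a constant playing the role of \hbar/2: a network that resolves x sharply
% (high accuracy) cannot simultaneously keep the spread of the conjugate small
% (robustness), and vice versa.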
