Understanding Global Loss Landscape of One-hidden-layer ReLU Networks, Part 1: Theory

12 Feb 2020 · Bo Liu

For one-hidden-layer ReLU networks, we prove that all differentiable local minima are global inside differentiable regions. We give the locations and losses of the differentiable local minima and show that they can be isolated points or continuous hyperplanes, depending on the interplay among the data, the activation patterns of the hidden neurons, and the network size. Furthermore, we give necessary and sufficient conditions for the existence of saddle points and non-differentiable local minima, together with their locations when they exist.
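
As a concrete frame for these claims, here is a minimal sketch of the standard one-hidden-layer ReLU setup; the notation is ours and may differ from the paper's. With hidden weights $w_1, \dots, w_k$, output weights $v_1, \dots, v_k$, and squared loss over samples $(x_i, y_i)$:

$$
f(x; W, v) = \sum_{j=1}^{k} v_j \max\bigl(0,\, w_j^\top x\bigr),
\qquad
L(W, v) = \frac{1}{2} \sum_{i=1}^{n} \bigl( f(x_i; W, v) - y_i \bigr)^2 .
$$

The activation pattern on the dataset is then the binary matrix $A$ with $A_{ij} = \mathbb{1}[\, w_j^\top x_i > 0 \,]$, and a differentiable region is a set of parameters on which $A$ is constant, so that each ReLU acts as a fixed linear (or zero) map on every sample and $L$ is smooth there. The paper's results concern the minima of $L$ within such regions and on their non-differentiable boundaries.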
