no code implementations • 5 Oct 2023 • Jonatan Vallin, Karl Larsson, Mats G. Larson
This leads to a geometric interpretation of a ReLU layer as a projection onto a polyhedral cone followed by an affine transformation, in line with the description in [doi:10.48550/arXiv.1905.08922] for convolutional networks with ReLU activations.
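The simplest instance of this viewpoint is that the ReLU activation itself is the Euclidean projection onto the nonnegative orthant, which is a polyhedral cone. The sketch below (a minimal NumPy illustration, not the paper's full decomposition) checks numerically that applying ReLU to a pre-activation `W @ x + b` agrees with projecting it onto that cone:

```python
import numpy as np

def relu(z):
    """Elementwise ReLU activation."""
    return np.maximum(z, 0.0)

def project_nonneg_orthant(z):
    """Euclidean projection onto the polyhedral cone {y : y >= 0}.

    For this cone the projection is computed coordinatewise:
    argmin_{y >= 0} ||y - z|| simply clips negatives to zero.
    """
    return np.clip(z, 0.0, None)

# Hypothetical layer weights for illustration only.
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 4))
b = rng.standard_normal(3)
x = rng.standard_normal(4)

z = W @ x + b
# ReLU of the pre-activation coincides with the cone projection.
print(np.allclose(relu(z), project_nonneg_orthant(z)))
```

The paper's interpretation goes further, composing such a projection with an affine transformation to describe an entire ReLU layer; the snippet only verifies the projection identity underlying that view.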