Search Results for author: Nicolas Martin

Found 2 papers, 0 papers with code

Under what circumstances do local codes emerge in feed-forward neural networks

no code implementations · 25 Sep 2019 · Ella M. Gale, Nicolas Martin

Pseudo-deep networks (two hidden layers) that have many local codes (LCs) lose them when common aspects of deep-NN research are applied (large training data, ReLU activations, early stopping on training accuracy, and softmax), suggesting that LCs may not be found in deep NNs.

When and where do feed-forward neural networks learn localist representations?

no code implementations · ICLR 2018 · Ella M. Gale, Nicolas Martin, Jeffrey S. Bowers

We find that the number of local codes that emerge from a NN follows a well-defined distribution across the number of hidden-layer neurons, with a peak determined by the size of the input data, the number of examples presented, and the sparsity of the input data.
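As a rough illustration of what counting local codes in a hidden layer might involve, here is a minimal sketch in NumPy. The abstracts above do not give the papers' exact selectivity criterion, so the rule used here — a unit counts as a local code if its weakest activation for one class exceeds its strongest activation for every other class — and the function name `count_local_codes` are assumptions for illustration only.

```python
import numpy as np

def count_local_codes(acts, labels):
    """Count hidden units acting as 'local codes' under a simplified
    criterion (assumption: the papers' actual measure may differ):
    a unit is a local code for class c if its minimum activation over
    examples of c exceeds its maximum activation over all other examples.

    acts:   (n_examples, n_units) hidden-layer activations
    labels: (n_examples,) class labels
    """
    classes = np.unique(labels)
    n_local = 0
    for u in range(acts.shape[1]):
        a = acts[:, u]
        for c in classes:
            # Unit responds strictly more to every example of class c
            # than to any example of any other class.
            if a[labels == c].min() > a[labels != c].max():
                n_local += 1
                break  # count each unit at most once
    return n_local

# Toy example: unit 0 is constructed to fire only for class 1.
labels = np.array([0, 0, 1, 1])
acts = np.array([
    [0.1, 0.50, 0.3],
    [0.2, 0.40, 0.6],
    [0.9, 0.45, 0.2],
    [0.8, 0.55, 0.5],
])
print(count_local_codes(acts, labels))  # 1 (only unit 0 is selective)
```

Sweeping this count over networks with varying hidden-layer sizes is one way the distribution described above could, in principle, be plotted.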
