no code implementations • 6 Oct 2020 • Pietro Verzelli, Cesare Alippi, Lorenzo Livi
In recent years, the machine learning community has shown a continuously growing interest in research investigating the dynamical aspects of both training procedures and machine learning models.
no code implementations • 24 Mar 2020 • Pietro Verzelli, Cesare Alippi, Lorenzo Livi, Peter Tino
Reservoir computing is a popular approach to designing recurrent neural networks, owing to its simple training procedure and its approximation capabilities.
1 code implementation • 27 Mar 2019 • Pietro Verzelli, Cesare Alippi, Lorenzo Livi
Finding such a region requires searching the hyper-parameter space in a principled way: configurations falling marginally outside it may yield networks exhibiting fully developed chaos, and hence producing unreliable computations.
no code implementations • 3 Oct 2018 • Pietro Verzelli, Lorenzo Livi, Cesare Alippi
Echo State Networks (ESNs) are simplified recurrent neural network models composed of a reservoir and a linear, trainable readout layer.
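The ESN architecture described above can be sketched in a few lines of NumPy: a fixed random reservoir rescaled to a chosen spectral radius drives a nonlinear state update, and only the linear readout is trained (here by ridge regression on a toy next-step prediction task). The specific values — reservoir size, spectral radius 0.9, input scaling, washout length, regularization — are illustrative assumptions, not parameters from the papers listed here.

```python
import numpy as np

rng = np.random.default_rng(0)

n_res, n_in = 100, 1   # reservoir size and input dimension (hypothetical choices)
rho = 0.9              # target spectral radius, a common stability heuristic

# Fixed random reservoir, rescaled to the desired spectral radius
W = rng.standard_normal((n_res, n_res))
W *= rho / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))  # fixed random input weights

def run_reservoir(u):
    """Drive the reservoir with input sequence u (T x n_in); return states (T x n_res)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)  # leak-free tanh state update
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave
T = 500
u = np.sin(np.linspace(0, 20 * np.pi, T + 1)).reshape(-1, 1)
X = run_reservoir(u[:-1])  # reservoir states
y = u[1:]                  # next-step targets

# Train only the linear readout, via ridge regression (closed form)
washout, lam = 50, 1e-6    # discard initial transient; small regularizer
Xw, yw = X[washout:], y[washout:]
W_out = np.linalg.solve(Xw.T @ Xw + lam * np.eye(n_res), Xw.T @ yw)

pred = X @ W_out
mse = np.mean((pred[washout:] - yw) ** 2)
```

Note that the reservoir weights `W` and `W_in` are never updated: all learning is confined to solving a single linear system for `W_out`, which is what makes ESN training so cheap compared to backpropagation through time.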