The goal of time series prediction is to infer the future values of a time series from its past observations.
We propose POLA (Predicting Online by Learning rate Adaptation) to automatically regulate the learning rate of recurrent neural network models to adapt to changing time series patterns across time.
Hyperparameter optimization has remained a central topic within the machine learning community due to its ability to produce state-of-the-art results.
This paper aims to establish a framework for extreme learning machines (ELMs) on general hypercomplex algebras.
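What makes ELMs attractive is that the hidden layer is random and fixed, so training reduces to a single least-squares solve for the output weights. Below is a minimal real-valued ELM sketch (the hypercomplex generalization is the paper's contribution and is not shown); the toy sine-regression data and the layer size of 50 are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative): learn y = sin(x) on [0, 2*pi].
X = np.linspace(0, 2 * np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()

# Random, fixed hidden layer: these weights are never trained.
n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)  # hidden-layer activations

# Only the output weights are learned, via the pseudoinverse
# (a single least-squares solve, not iterative backpropagation).
beta = np.linalg.pinv(H) @ y

y_hat = H @ beta
mse = np.mean((y - y_hat) ** 2)
print(f"train MSE: {mse:.5f}")
```

Because the only trained parameters come from one linear solve, an ELM fits in milliseconds where gradient-trained networks need many epochs.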
Reservoir Computing (RC) is an appealing approach in Machine Learning that combines the high computational capabilities of Recurrent Neural Networks with a fast and easy training method.
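The "fast and easy training" in RC comes from keeping the recurrent weights fixed and fitting only a linear readout. A minimal echo state network sketch along those lines, assuming a toy next-step sine-prediction task and illustrative sizes (100 reservoir units, spectral radius 0.9):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task (illustrative): one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)            # input sequence
target = np.roll(u, -1)  # next value at each step

# Fixed random reservoir: the recurrent part is never trained.
n_res = 100
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

# Drive the reservoir and collect its states.
states = np.zeros((len(u), n_res))
x = np.zeros(n_res)
for i, ui in enumerate(u):
    x = np.tanh(W_in * ui + W @ x)
    states[i] = x

# Training = one ridge regression on the linear readout.
washout = 100                      # discard initial transient states
S, y = states[washout:-1], target[washout:-1]
ridge = 1e-6
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ y)

pred = S @ W_out
mse = np.mean((pred - y) ** 2)
print(f"readout MSE: {mse:.6f}")
```

Rescaling `W` so its spectral radius is below 1 is the usual heuristic for keeping the reservoir dynamics stable (the echo state property).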
Railway systems require regular manual maintenance, a large part of which is dedicated to track deformation inspection.
We propose a continuous neural network architecture, termed Explainable Tensorized Neural Ordinary Differential Equations (ETN-ODE), for multi-step time series prediction at arbitrary time points.
This learning approach does not require backpropagating the output error to learn the parameters of the premise parts.
This work introduces Continuous Ant-based Neural Topology Search (CANTS), a novel, nature-inspired neural architecture search (NAS) algorithm based on ant colony optimization. CANTS utilizes synthetic ants that move over a continuous search space according to the density and distribution of pheromones, and is strongly inspired by how ants move in the real world.
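The core idea of pheromone-guided movement in a continuous space can be illustrated with a generic continuous ant colony optimization sketch (in the style of ACO_R, not the CANTS algorithm itself); the sphere objective stands in for a NAS fitness function, and all parameter values here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(p):
    """Toy objective to minimize (stand-in for a NAS fitness)."""
    return np.sum(p ** 2)

# An archive of past solutions plays the role of pheromone deposits:
# ants preferentially sample new points near good archive members,
# i.e. where the "pheromone" is densest.
dim, n_ants, archive_size, n_iters = 2, 10, 10, 100
archive = rng.uniform(-5, 5, size=(archive_size, dim))
fitness = np.array([sphere(p) for p in archive])

for _ in range(n_iters):
    order = np.argsort(fitness)
    archive, fitness = archive[order], fitness[order]
    # Rank-based weights: better solutions attract more ants.
    w = np.exp(-np.arange(archive_size) / 2.0)
    w /= w.sum()
    for _ in range(n_ants):
        k = rng.choice(archive_size, p=w)      # pick a guide deposit
        # Step size from the spread of the archive around the guide.
        sigma = np.mean(np.abs(archive - archive[k]), axis=0) + 1e-12
        cand = rng.normal(archive[k], sigma)   # ant moves nearby
        f = sphere(cand)
        if f < fitness[-1]:                    # replace the worst
            archive[-1], fitness[-1] = cand, f

best = np.min(fitness)
print(f"best objective: {best:.6f}")
```

As the archive concentrates around good regions, the sampling spread `sigma` shrinks, so exploration automatically narrows toward exploitation.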