Multilayer Fisher extreme learning machine for classification

As a special deep learning algorithm, the multilayer extreme learning machine (ML-ELM) has been extensively studied to solve practical problems in recent years. The ML-ELM is constructed from the extreme learning machine autoencoder (ELM-AE), and its generalization performance depends on the representation learning of the ELM-AE. However, because the learning of the ELM-AE is unsupervised, it is difficult to build a discriminative feature space for classification tasks even when label information is available. To address this problem, a novel Fisher extreme learning machine autoencoder (FELM-AE) is proposed and used as the building block of the multilayer Fisher extreme learning machine (ML-FELM). The FELM-AE introduces the Fisher criterion into the ELM-AE by adding a Fisher regularization term to the objective function, aiming to maximize the between-class distance and minimize the within-class distance of the abstract features. Unlike the ELM-AE, the FELM-AE requires class labels to compute the Fisher regularization loss, so the learned abstract features contain sufficient category information for classification tasks. The ML-FELM stacks FELM-AEs to extract features and adopts the extreme learning machine (ELM) to classify samples. Experiments on benchmark datasets show that the abstract features extracted by the FELM-AE are more discriminative than those of the ELM-AE, and that the classification results of the ML-FELM are more competitive and robust than those of the ELM, the one-dimensional convolutional neural network (1D-CNN), the ML-ELM, the denoising multilayer extreme learning machine (D-ML-ELM), the multilayer generalized extreme learning machine (ML-GELM), and the hierarchical extreme learning machine with L21-norm loss and regularization (H-LR21-ELM).

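As a rough illustration of the idea described in the abstract, the sketch below implements one plausible form of a Fisher-regularized ELM autoencoder layer and stacks it into an ML-FELM-style pipeline. It assumes the layer objective $\min_\beta \tfrac{1}{2}\|\beta\|^2 + \tfrac{C}{2}\|H\beta - X\|^2 + \tfrac{\lambda}{2}\,\mathrm{tr}\big(\beta (S_w - S_b)\beta^\top\big)$, where $H$ is the random hidden mapping and $S_w$, $S_b$ are the within- and between-class scatter matrices of the layer input. The paper's exact objective, solver, and hyperparameters may differ, and all function names here (felm_ae_layer, ml_felm_fit_predict, etc.) are illustrative, not the authors' implementation.

```python
# Minimal sketch of a Fisher-regularized ELM autoencoder (FELM-AE-like layer)
# and an ML-FELM-style stack, under the assumed objective stated above.
import numpy as np
from scipy.linalg import solve_sylvester


def scatter_matrices(X, y):
    """Within-class (S_w) and between-class (S_b) scatter matrices of X."""
    overall_mean = X.mean(axis=0)
    d = X.shape[1]
    S_w, S_b = np.zeros((d, d)), np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        D = Xc - mc
        S_w += D.T @ D
        diff = (mc - overall_mean).reshape(-1, 1)
        S_b += Xc.shape[0] * (diff @ diff.T)
    return S_w, S_b


def felm_ae_layer(X, y, n_hidden, C=1.0, lam=0.1, rng=None):
    """One Fisher-regularized autoencoder layer; returns output weights beta."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    W = rng.uniform(-1, 1, size=(d, n_hidden))   # random input weights
    b = rng.uniform(-1, 1, size=n_hidden)        # random biases
    H = np.tanh(X @ W + b)                       # random hidden mapping
    S_w, S_b = scatter_matrices(X, y)
    # Stationarity condition of the assumed objective is a Sylvester equation:
    #   (I + C * H^T H) beta + lam * beta (S_w - S_b) = C * H^T X
    A = np.eye(n_hidden) + C * (H.T @ H)
    B = lam * (S_w - S_b)
    Q = C * (H.T @ X)
    return solve_sylvester(A, B, Q)              # beta: (n_hidden, d)


def elm_classifier(X, y, n_hidden=200, C=1.0, rng=None):
    """Plain ELM classifier on top of the stacked features (labels 0..k-1)."""
    rng = np.random.default_rng(rng)
    W = rng.uniform(-1, 1, size=(X.shape[1], n_hidden))
    b = rng.uniform(-1, 1, size=n_hidden)
    H = np.tanh(X @ W + b)
    T = np.eye(len(np.unique(y)))[y]             # one-hot targets
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ T)
    return lambda Xnew: np.argmax(np.tanh(Xnew @ W + b) @ beta, axis=1)


def ml_felm_fit_predict(X_train, y_train, X_test, hidden_sizes=(100, 100)):
    """Stack Fisher-regularized AE layers for features, then classify with ELM."""
    A_train, A_test = X_train, X_test
    for n_hidden in hidden_sizes:
        beta = felm_ae_layer(A_train, y_train, n_hidden)
        A_train = np.tanh(A_train @ beta.T)      # abstract features for next layer
        A_test = np.tanh(A_test @ beta.T)
    predict = elm_classifier(A_train, y_train)
    return predict(A_test)
```

Under the assumed objective, the first-order condition couples beta from both sides, so the layer weights are obtained from a Sylvester equation (via scipy.linalg.solve_sylvester) rather than the ordinary ridge-regression solve used by a plain ELM-AE; dropping the Fisher term (lam = 0) recovers the standard ML-ELM layer update.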