A Gegenbauer Neural Network with Regularized Weights Direct Determination for Classification

25 Oct 2019  ·  Jie He, Tao Chen, Zhijun Zhang

Single-hidden-layer feedforward neural networks (SLFNs) are widely used in pattern classification, but a major bottleneck is the slow speed and poor performance of traditional iterative gradient-based learning algorithms. Although the well-known extreme learning machine (ELM) successfully addresses the slow convergence, it still suffers from computational robustness problems caused by its randomly assigned input weights and biases. To overcome these problems, this paper constructs and investigates a novel neural network based on Gegenbauer orthogonal polynomials, termed GNN. This model overcomes the computational robustness problems of ELM while retaining comparable structural simplicity and approximation capability. On this basis, we propose a regularized weights direct determination (R-WDD) method, based on equality-constrained optimization, to determine the optimal output weights. R-WDD minimizes both the empirical risk and the structural risk of the network, thereby lowering the risk of overfitting and improving generalization ability. The result is the final GNN with R-WDD, a unified learning mechanism for binary and multi-class classification problems. Finally, as verified in various comparison experiments, GNN with R-WDD achieves comparable (or even better) generalization performance, computational scalability and efficiency, and classification robustness compared to the least squares support vector machine (LS-SVM) and ELM with a Gaussian kernel.
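The paper's exact formulation is in the full text; as a minimal sketch of the two ingredients the abstract describes (a Gegenbauer polynomial feature expansion and a regularized closed-form solve for the output weights), the Python snippet below illustrates the idea. The function names, the per-dimension expansion, and the regularization parameter C are assumptions for illustration, not the authors' exact method; the 1/C ridge term plays the role of the structural-risk penalty.

```python
import numpy as np
from scipy.special import eval_gegenbauer

def gegenbauer_features(X, degree=5, alpha=1.0):
    """Expand each input dimension (scaled to [-1, 1]) with Gegenbauer
    polynomials C_0^alpha(x), ..., C_degree^alpha(x). Hypothetical mapping:
    the paper may combine dimensions differently."""
    return np.concatenate(
        [eval_gegenbauer(n, alpha, X) for n in range(degree + 1)], axis=1
    )

def r_wdd(H, Y, C=1e3):
    """Sketch of regularized weights direct determination: solve
    min_W ||H @ W - Y||^2 + (1/C) * ||W||^2 in closed form, trading off
    empirical risk (training error) against structural risk (weight norm)."""
    L = H.shape[1]
    return np.linalg.solve(H.T @ H + np.eye(L) / C, H.T @ Y)

# Toy usage on synthetic data with one-hot targets.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 3))       # inputs scaled to [-1, 1]
y = (X[:, 0] + X[:, 1] ** 2 > 0.3).astype(int)  # synthetic binary labels
Y = np.eye(2)[y]                                # one-hot encoding

H = gegenbauer_features(X)                      # hidden-layer output matrix
W = r_wdd(H, Y)                                 # output weights, no iteration
pred = np.argmax(H @ W, axis=1)
print("training accuracy:", (pred == y).mean())
```

Because the hidden-layer mapping is deterministic (no random input weights) and the output weights come from a single linear solve, the sketch reflects why the approach avoids both the iterative training of gradient-based SLFNs and the randomness-induced robustness issues of ELM.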
