Restricted Minimum Error Entropy Criterion for Robust Classification

6 Sep 2019 · Yuanhao Li, Badong Chen, Natsue Yoshimura, Yasuharu Koike

The minimum error entropy (MEE) criterion has been verified as a powerful approach for non-Gaussian signal processing and robust machine learning. However, its application to robust classification remains largely unexplored in the literature. The original MEE focuses only on minimizing Renyi's quadratic entropy of the error probability density function (PDF), which can fail in noisy classification tasks. To address this, we analyze the optimal error distribution in the presence of outliers for classifiers with continuous errors, and introduce a simple codebook that restricts MEE so that it drives the error PDF towards the desired distribution. A half-quadratic based optimization and a convergence analysis of the new learning criterion, called restricted MEE (RMEE), are provided. Experimental results with logistic regression and the extreme learning machine verify the desirable robustness of RMEE.
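For readers unfamiliar with the MEE criterion referenced above, the sketch below illustrates the standard Parzen-window estimator of Renyi's quadratic entropy of the training errors, which MEE minimizes (equivalently, it maximizes the quadratic information potential). The function names, the Gaussian kernel bandwidth `sigma`, and the toy error samples are illustrative assumptions; the sketch does not implement the paper's codebook restriction or half-quadratic optimization.

```python
import numpy as np

def gaussian_kernel(x, sigma):
    """Gaussian kernel G_sigma(x) used in the Parzen-window density estimate."""
    return np.exp(-x**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def information_potential(errors, sigma=1.0):
    """Empirical quadratic information potential V(e) = (1/N^2) * sum_ij G_sigma(e_i - e_j)."""
    e = np.asarray(errors, dtype=float)
    diffs = e[:, None] - e[None, :]          # all pairwise error differences
    return gaussian_kernel(diffs, sigma).mean()

def mee_loss(errors, sigma=1.0):
    """Estimate of Renyi's quadratic entropy, H_2(e) = -log V(e); MEE minimizes this."""
    return -np.log(information_potential(errors, sigma))

# Toy comparison (illustrative values only): errors dispersed by outliers
# yield a larger entropy estimate than tightly concentrated errors.
rng = np.random.default_rng(0)
clean_errors = rng.normal(0.0, 0.1, size=200)
noisy_errors = np.concatenate([clean_errors, rng.normal(5.0, 0.1, size=20)])
print(mee_loss(clean_errors, sigma=0.5), mee_loss(noisy_errors, sigma=0.5))
```

Because the entropy estimate depends only on pairwise error differences, it is shift-invariant and says nothing about where the error PDF is centered; this is the gap the proposed codebook restriction is meant to close.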
