Wrapped Loss Function for Regularizing Nonconforming Residual Distributions

21 Aug 2018 · Chun Ting Liu, Ming Chuan Yang, Meng Chang Chen

Multi-output prediction is essential in machine learning, but it may suffer from nonconforming residual distributions, i.e., the residual distributions of the individual outputs do not conform to the expected distribution. In this paper, we propose a "wrapped loss function" that wraps the original loss function to alleviate this problem. The wrapped loss function behaves just like the original loss function, in that its gradient can be used directly for backpropagation. Empirical evaluations show that the wrapped loss function offers faster convergence, better accuracy, and improved performance on imbalanced data.
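The abstract does not give the paper's actual formulation, but the general idea of wrapping a base loss with a penalty on nonconforming per-output residual distributions can be sketched as follows. This is a hypothetical illustration only: the base MSE loss, the variance-deviation penalty, and the weight `alpha` are all assumptions, not the authors' method.

```python
import numpy as np

def mse_loss(residuals):
    # Base loss: mean squared error over all outputs and samples.
    return np.mean(residuals ** 2)

def wrapped_loss(residuals, base_loss=mse_loss, alpha=0.5):
    """Illustrative wrapper (not the paper's formulation).

    Adds a penalty when the residual variance of each output
    dimension deviates from the pooled residual variance, i.e.,
    when the per-output residual distributions do not conform
    to a common expected distribution.

    residuals : array of shape (n_samples, n_outputs)
    alpha     : assumed penalty weight (hypothetical)
    """
    base = base_loss(residuals)
    per_output_var = residuals.var(axis=0)   # variance of each output's residuals
    pooled_var = residuals.var()             # variance pooled over all outputs
    nonconformity = np.mean((per_output_var - pooled_var) ** 2)
    return base + alpha * nonconformity
```

When every output's residuals share the same distribution, the penalty vanishes and the wrapped loss reduces to the base loss; when one output's residuals are much more spread out than the others, the penalty grows, which is the behavior the paper's regularization targets.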
