no code implementations • 16 Oct 2023 • Taejong Joo, Diego Klabjan
Extensive experiments show the effectiveness of group accuracy estimation for model calibration and model selection.
no code implementations • 28 Sep 2020 • Taejong Joo, Uijung Chung
In this work, we revisit the role and importance of explicit regularization methods for the generalization of the predictive probability, not just the generalization of the 0-1 loss.
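The distinction drawn here can be made concrete: two classifiers can agree on every hard prediction (identical 0-1 loss) while producing predictive probabilities of very different quality. A minimal illustration, with hypothetical probability vectors chosen only for demonstration:

```python
import numpy as np

def nll(probs, label):
    # Negative log-likelihood of the true label under the predicted distribution.
    return -np.log(probs[label])

# Both predictions classify the true label 0 correctly (same 0-1 loss) ...
confident = np.array([0.99, 0.005, 0.005])
hesitant = np.array([0.40, 0.30, 0.30])

# ... but their predictive probabilities generalize very differently:
print(nll(confident, 0))  # ~0.01
print(nll(hesitant, 0))   # ~0.92
```

The 0-1 loss is blind to this gap, which is why evaluating generalization on the predictive probability itself is a separate question.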
no code implementations • 11 Jun 2020 • Taejong Joo, Uijung Chung
From the statistical learning perspective, complexity control via explicit regularization is a necessity for improving the generalization of over-parameterized models.
1 code implementation • ICML 2020 • Taejong Joo, Uijung Chung, Min-Gwan Seo
Neural networks use the softmax as a building block in classification tasks, but it suffers from overconfidence and lacks the ability to represent uncertainty.
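The overconfidence issue is easy to demonstrate: even modest gaps between raw logits are mapped by the softmax to near-certain probabilities. A small sketch with illustrative logit values:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax: shift by the max before exponentiating.
    e = np.exp(z - z.max())
    return e / e.sum()

# A logit gap of a few units already yields near-certainty:
logits = np.array([5.0, 1.0, 0.0])
p = softmax(logits)
print(p)  # top class receives ~0.98 probability
```

Because the output always sums to one regardless of how far the input lies from the training distribution, the softmax alone cannot signal "I don't know", which motivates replacing it with representations that carry uncertainty.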
no code implementations • ICLR 2020 • Taejong Joo, Donggu Kang, Byunghoon Kim
We propose the projected error function regularization loss (PER) that encourages activations to follow the standard normal distribution.
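The idea of pulling activations toward the standard normal distribution can be sketched as a differentiable penalty on activation statistics. The moment-matching penalty below is a hypothetical stand-in for illustration, not the paper's projected error function loss:

```python
import numpy as np

def normality_penalty(acts):
    # Hypothetical stand-in (not the paper's PER loss): penalize deviation
    # of the empirical mean and variance from those of N(0, 1).
    mean = acts.mean()
    var = acts.var()
    return mean ** 2 + (var - 1.0) ** 2

rng = np.random.default_rng(0)
standard = rng.standard_normal(10_000)     # already ~N(0, 1)
shifted = standard * 2.0 + 1.0             # mean 1, variance 4

print(normality_penalty(standard))  # near 0
print(normality_penalty(shifted))   # clearly larger
```

Added to the training objective with a small weight, such a term discourages activation distributions from drifting away from the standard normal during training.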