no code implementations • 22 Nov 2023 • Seongyoon Kim, Gihun Lee, Jaehoon Oh, Se-Young Yun
Additionally, we observe that as data heterogeneity increases, the gap widens between the feature norms of observed classes, which are higher in local models, and the feature norms of unobserved classes, in contrast to the behavior of the classifier weight norms.
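The norm-gap statistic described above can be sketched as follows. This is a minimal illustration on synthetic features, not the paper's setup: the feature dimension, class split, and the norm-inflating scale factor are all assumed purely to make the gap visible.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical penultimate-layer features from a local model (dim 16),
# with labels over 10 classes; classes 0-4 are "observed" locally.
features = rng.normal(size=(1000, 16))
labels = rng.integers(0, 10, size=1000)
observed = labels < 5

# Inject the reported effect for illustration only: observed-class
# features receive larger norms in the local model.
features[observed] *= 2.0

# Gap between mean feature norms of observed vs. unobserved classes.
norms = np.linalg.norm(features, axis=1)
gap = norms[observed].mean() - norms[~observed].mean()
```

Under data heterogeneity, tracking how `gap` grows (while the analogous gap over classifier weight norms does not) is the kind of diagnostic the observation suggests.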
no code implementations • 31 Mar 2023 • Seongyoon Kim, Hangsoon Jung, Minho Lee, Yun Young Choi, Jung-Il Choi
The method involves predicting a few knots at specific retention levels using a deep learning-based model and interpolating them to reconstruct the trajectory.
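The knot-and-interpolate step can be sketched as below. The retention levels and the predicted cycle numbers are assumed placeholder values; in the method they would come from the deep-learning model, and the interpolation scheme here is plain linear interpolation for illustration.

```python
import numpy as np

# Hypothetical knots: cycle numbers at which the cell reaches fixed
# retention levels (in practice, predicted by the learned model).
retention_levels = np.array([1.00, 0.95, 0.90, 0.85, 0.80])
predicted_cycles = np.array([0, 150, 310, 480, 700])  # assumed values

# Reconstruct the full retention trajectory over cycles by
# interpolating between the predicted knots.
cycles = np.arange(0, 701)
retention = np.interp(cycles, predicted_cycles, retention_levels)
```

Because only a handful of knots are predicted, the model output stays low-dimensional while the interpolation recovers a dense trajectory.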
1 code implementation • 31 May 2022 • Sumyeong Ahn, Seongyoon Kim, Se-Young Yun
In this study, we propose a debiasing algorithm called PGD (Per-sample Gradient-based Debiasing), which comprises three steps: (1) training a model with uniform batch sampling, (2) setting each sample's importance in proportion to the norm of its gradient, and (3) retraining the model with importance-batch sampling, where the sampling probabilities are those obtained in step (2).
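The three steps above can be sketched on a toy logistic-regression model, where per-sample gradients have a closed form. The data, model, learning rate, and step counts are all assumptions for illustration; the real algorithm operates on deep networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a hypothetical stand-in for a (biased) dataset.
n, d = 200, 5
X = rng.normal(size=(n, d))
y = (X @ rng.normal(size=d) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def per_sample_grads(w, X, y):
    # Gradient of the logistic loss for each sample: (p - y) * x.
    p = sigmoid(X @ w)
    return (p - y)[:, None] * X

# Step (1): train with uniform batch sampling.
w = np.zeros(d)
for _ in range(200):
    idx = rng.choice(n, size=32)
    w -= 0.1 * per_sample_grads(w, X[idx], y[idx]).mean(axis=0)

# Step (2): importance proportional to the per-sample gradient norm.
norms = np.linalg.norm(per_sample_grads(w, X, y), axis=1)
probs = norms / norms.sum()

# Step (3): retrain with importance-batch sampling using those probs.
w2 = np.zeros(d)
for _ in range(200):
    idx = rng.choice(n, size=32, p=probs)
    w2 -= 0.1 * per_sample_grads(w2, X[idx], y[idx]).mean(axis=0)

acc = ((sigmoid(X @ w2) > 0.5) == y).mean()
```

The key design choice is that step (2) upweights samples the step-(1) model handles poorly, which in the bias-conflicting setting tends to emphasize minority samples.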
no code implementations • 2 Jul 2021 • Seongyoon Kim, Yun Young Choi, Jung-Il Choi
This paper proposes a fully unsupervised methodology for the reliable extraction of latent variables representing the characteristics of lithium-ion batteries (LIBs) from electrochemical impedance spectroscopy (EIS) data using information-maximizing generative adversarial networks (InfoGANs).
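The InfoGAN ingredient can be sketched as follows: the generator input splits into unstructured noise `z` and structured codes `c` intended to capture battery characteristics, and a Q-head is trained to recover `c` from generated spectra, maximizing a mutual-information lower bound. The dimensions and the MSE-style information loss (valid for a fixed-variance Gaussian `Q(c|x)`) are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def info_loss(c_true, c_mean_pred):
    # Mutual-information lower bound for continuous codes under a
    # fixed-variance Gaussian Q(c|x): mean squared error up to constants.
    return np.mean((c_true - c_mean_pred) ** 2)

# Hypothetical generator input: unstructured noise z plus interpretable
# codes c meant to represent LIB characteristics.
batch = 8
z = rng.normal(size=(batch, 62))
c = rng.uniform(-1, 1, size=(batch, 2))
latent = np.concatenate([z, c], axis=1)

# In training, the generator maps `latent` to synthetic EIS spectra and
# the Q-head predicts c back; here a noisy copy stands in for Q(G(z, c)).
c_pred = c + 0.1 * rng.normal(size=c.shape)
loss = info_loss(c, c_pred)
```

Minimizing `loss` jointly with the adversarial objective forces the codes `c` to remain recoverable from the spectra, which is what makes them usable as latent battery descriptors.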