1 code implementation • 2 Mar 2024 • Yeongmin Kim, Byeonghu Na, Minsang Park, JoonHo Jang, Dongjun Kim, Wanmo Kang, Il-Chul Moon
While directly applying it to score-matching is intractable, we discover that using the time-dependent density ratio both for reweighting and score correction can lead to a tractable form of the objective function to regenerate the unbiased data density.
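The reweighting idea can be illustrated with a toy importance-weighting example (a self-contained sketch, not the paper's time-dependent estimator): given a density ratio between the unbiased and biased densities, expectations under the unbiased density can be recovered from biased samples. Here both densities are Gaussians, so the ratio is known in closed form; in practice it would have to be estimated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy illustration of density-ratio reweighting (not the paper's estimator):
# samples come from a biased density q = N(1, 1), but we want expectations
# under the unbiased density p = N(0, 1). The log-ratio log p(x)/q(x) is
# known in closed form for two Gaussians with equal variance.
def log_ratio(x, mu_p=0.0, mu_q=1.0, sigma=1.0):
    return ((x - mu_q) ** 2 - (x - mu_p) ** 2) / (2 * sigma ** 2)

x = rng.normal(1.0, 1.0, size=200_000)    # biased samples from q
w = np.exp(log_ratio(x))                  # importance weights r(x) = p(x)/q(x)

biased_mean = x.mean()                    # close to 1.0, the mean of q
unbiased_mean = (w * x).sum() / w.sum()   # self-normalized estimate of E_p[x] = 0
```

The self-normalized form `(w * x).sum() / w.sum()` keeps the estimate stable even when the ratio is only known up to a constant, which is the usual situation when the ratio is learned from data.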
1 code implementation • Proceedings of the 40th International Conference on Machine Learning 2023 • Yoon-Yeong Kim, Youngjae Cho, JoonHo Jang, Byeonghu Na, Yeongmin Kim, Kyungwoo Song, Wanmo Kang, Il-Chul Moon
Specifically, our proposed method, Sharpness-Aware Active Learning (SAAL), constructs its acquisition function by selecting unlabeled instances whose perturbed loss becomes maximum.
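The acquisition step can be sketched on a tiny logistic model (a hedged illustration; the pseudo-labeling, the logistic loss, and the function names here are assumptions for the sketch, not the paper's exact procedure): each unlabeled point is scored by its loss after a SAM-style weight perturbation in the gradient-ascent direction, and the highest-scoring points are queried.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Sharpness-aware acquisition score in the spirit of SAAL (illustrative sketch):
# (1) take the model's pseudo-label for the unlabeled point,
# (2) perturb the weights toward higher loss on that point (SAM-style ascent),
# (3) score the point by its loss at the perturbed weights.
def saal_scores(w, X, rho=0.05):
    scores = []
    for x in X:
        p = sigmoid(w @ x)
        y = float(p >= 0.5)                  # pseudo-label from the current model
        grad = (p - y) * x                   # d(logistic loss)/dw at (x, y)
        w_adv = w + rho * grad / (np.linalg.norm(grad) + 1e-12)
        p_adv = sigmoid(w_adv @ x)
        loss_adv = -(y * np.log(p_adv + 1e-12)
                     + (1 - y) * np.log(1 - p_adv + 1e-12))
        scores.append(loss_adv)
    return np.array(scores)

w = np.array([1.0, -1.0])
X = rng.normal(size=(100, 2))
scores = saal_scores(w, X)
query_idx = np.argsort(scores)[-10:]         # acquire the top-10 by perturbed loss
```

Because the pseudo-label agrees with the current prediction, the unperturbed loss is small by construction; the perturbation is what exposes instances sitting in sharp regions of the loss landscape.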
1 code implementation • 15 Jun 2022 • JoonHo Jang, Byeonghu Na, DongHyeok Shin, Mingi Ji, Kyungwoo Song, Il-Chul Moon
Therefore, we propose Unknown-Aware Domain Adversarial Learning (UADAL), which $\textit{aligns}$ the source and the target-$\textit{known}$ distribution while simultaneously $\textit{segregating}$ the target-$\textit{unknown}$ distribution in the feature alignment procedure.
2 code implementations • 2 May 2022 • HeeSun Bae, Seungjae Shin, Byeonghu Na, JoonHo Jang, Kyungwoo Song, Il-Chul Moon
We suggest a new branch of methods, Noisy Prediction Calibration (NPC), for learning with noisy labels.

1 code implementation • NeurIPS 2021 • Yoon-Yeong Kim, Kyungwoo Song, JoonHo Jang, Il-Chul Moon
Active learning effectively collects data instances for training deep learning models when the labeled dataset is limited and the annotation cost is high.
no code implementations • 11 Mar 2021 • JoonHo Jang, Heun Mo Yoo, Loren N. Pfeiffer, Kenneth W. West, K. W. Baldwin, Raymond C. Ashoori
With fully tunable densities of individual layers, the floating bilayer QW system provides a versatile platform to access previously unavailable information on the quantum phases in electron bilayer systems.
Mesoscale and Nanoscale Physics
no code implementations • 24 Nov 2020 • Hyemi Kim, Seungjae Shin, JoonHo Jang, Kyungwoo Song, Weonyoung Joo, Wanmo Kang, Il-Chul Moon
Therefore, this paper proposes Disentangled Causal Effect Variational Autoencoder (DCEVAE) to resolve this limitation by disentangling the exogenous uncertainty into two latent variables: either 1) independent to interventions or 2) correlated to interventions without causality.
no code implementations • Findings of the Association for Computational Linguistics 2020 • Seungjae Shin, Kyungwoo Song, JoonHo Jang, Hyemi Kim, Weonyoung Joo, Il-Chul Moon
Recent research demonstrates that word embeddings, trained on human-generated corpora, have strong gender biases in their embedding spaces, and these biases can lead to discriminatory outcomes in various downstream tasks.
1 code implementation • 25 May 2019 • Kyungwoo Song, JoonHo Jang, Seungjae Shin, Il-Chul Moon
Long Short-Term Memory (LSTM) infers long-term dependencies through a cell state maintained by the input and forget gate structures, which model each gate output as a value in [0, 1] through a sigmoid function.
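The gating mechanism described above can be sketched as a single numpy LSTM step (a minimal standard-LSTM illustration; the stacked weight layout and dimensions are arbitrary choices for the sketch):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One step of a standard LSTM cell: the input, forget, and output gates are
# sigmoid-squashed into [0, 1] and control how the cell state c is maintained.
def lstm_step(x, h, c, W, U, b):
    # W, U, b stack the four transforms: input gate, forget gate, output gate,
    # and the tanh candidate update, each of hidden size d.
    z = W @ x + U @ h + b
    d = h.shape[0]
    i = sigmoid(z[0 * d:1 * d])        # input gate in [0, 1]
    f = sigmoid(z[1 * d:2 * d])        # forget gate in [0, 1]
    o = sigmoid(z[2 * d:3 * d])        # output gate in [0, 1]
    g = np.tanh(z[3 * d:4 * d])        # candidate cell update in [-1, 1]
    c_new = f * c + i * g              # gated cell-state maintenance
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
dx, dh = 3, 4
W = rng.normal(size=(4 * dh, dx))
U = rng.normal(size=(4 * dh, dh))
b = np.zeros(4 * dh)
h, c = np.zeros(dh), np.zeros(dh)
h, c = lstm_step(rng.normal(size=dx), h, c, W, U, b)
```

Because the output gate lies in (0, 1) and tanh lies in (-1, 1), every entry of the new hidden state is strictly inside (-1, 1), which is the bounded gating behavior the sentence describes.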