Search Results for author: Yeachan Kim

Found 11 papers, 2 papers with code

Improving Bias Mitigation through Bias Experts in Natural Language Understanding

1 code implementation • 6 Dec 2023 • Eojin Jeon, Mingyu Lee, Juhyeong Park, Yeachan Kim, Wing-Lam Mok, SangKeun Lee

To mitigate the detrimental effect of bias on networks, previous works have proposed debiasing methods that down-weight biased examples identified by an auxiliary model trained with explicit bias labels.

Binary Classification • Multi-class Classification +1
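
The excerpt above describes down-weighting examples that an auxiliary bias model already handles well. The following is a minimal, generic sketch of that kind of loss re-weighting (not the paper's actual bias-expert formulation); the function name, PyTorch usage, and tensor shapes are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def debiased_loss(main_logits, bias_probs, labels):
    """Down-weight examples on which the auxiliary bias model is already
    confident, so the main model focuses on harder, less biased examples."""
    # Per-example cross-entropy of the main model (no reduction).
    per_example_ce = F.cross_entropy(main_logits, labels, reduction="none")
    # Probability the bias model assigns to the gold label; high values
    # indicate the example is likely solvable from the bias alone.
    bias_conf = bias_probs.gather(1, labels.unsqueeze(1)).squeeze(1)
    weights = 1.0 - bias_conf  # down-weight bias-aligned examples
    return (weights * per_example_ce).mean()

# Toy usage with random tensors (batch of 4, 3 classes).
logits = torch.randn(4, 3)
bias_probs = torch.softmax(torch.randn(4, 3), dim=-1)
labels = torch.tensor([0, 2, 1, 0])
loss = debiased_loss(logits, bias_probs, labels)
```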

Learning From Drift: Federated Learning on Non-IID Data via Drift Regularization

no code implementations • 13 Sep 2023 • Yeachan Kim, Bonggun Shin

In this work, we carefully analyze the existing methods in heterogeneous environments.

Federated Learning

Dynamic Structure Pruning for Compressing CNNs

1 code implementation • 17 Mar 2023 • Jun-Hyung Park, Yeachan Kim, Junho Kim, Joon-Young Choi, SangKeun Lee

In this work, we introduce a novel structure pruning method, termed dynamic structure pruning, to identify optimal pruning granularities for intra-channel pruning.

Phase-shifted Adversarial Training

no code implementations • 12 Jan 2023 • Yeachan Kim, Seongyeon Kim, Ihyeok Seo, Bonggun Shin

Comprehensive results show that PhaseAT significantly improves the convergence for high-frequency information.

Adversarial Robustness

In Defense of Core-set: A Density-aware Core-set Selection for Active Learning

no code implementations • 10 Jun 2022 • Yeachan Kim, Bonggun Shin

The strategy is to estimate the density of the unlabeled samples and select diverse samples mainly from sparse regions.

Active Learning
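
The entry above states the core-set strategy at a high level: estimate the density of the unlabeled pool and sample mainly from sparse regions. Below is a rough, self-contained illustration of that idea using a k-NN distance as a density proxy; the helper name, the choice of k, and the plain NumPy implementation are assumptions, not the paper's algorithm.

```python
import numpy as np

def select_from_sparse_regions(features, budget, k=10):
    """Toy density-aware selection: score each unlabeled point by its k-th
    nearest-neighbour distance and pick the points in the sparsest regions."""
    # Pairwise Euclidean distances (O(n^2); fine for a small sketch).
    diffs = features[:, None, :] - features[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(-1))
    # Distance to the k-th nearest neighbour (index 0 is the point itself).
    knn_dist = np.sort(dists, axis=1)[:, k]
    # Large k-NN distance -> low local density -> sparse region.
    return np.argsort(-knn_dist)[:budget]

# Example: pick 5 points from a random 100-point pool of 32-d embeddings.
pool = np.random.randn(100, 32)
chosen = select_from_sparse_regions(pool, budget=5)
```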

Context-based Virtual Adversarial Training for Text Classification with Noisy Labels

no code implementations • LREC 2022 • Do-Myoung Lee, Yeachan Kim, Chang-gyun Seo

In this paper, we propose context-based virtual adversarial training (ConVAT) to prevent a text classifier from overfitting to noisy labels.

Memorization • Text Classification +1

An Interpretable Framework for Drug-Target Interaction with Gated Cross Attention

no code implementations • 17 Sep 2021 • Yeachan Kim, Bonggun Shin

In silico prediction of drug-target interactions (DTI) is important for drug discovery because it can greatly reduce the timelines and costs of the drug development process.

Drug Discovery

Adaptive Compression of Word Embeddings

no code implementations • ACL 2020 • Yeachan Kim, Kang-Min Kim, SangKeun Lee

However, unlike prior works that assign codes of the same length to all words, we adaptively assign codes of different lengths to each word by learning from downstream tasks.

Self-Driving Cars • Word Embeddings

Representation Learning for Unseen Words by Bridging Subwords to Semantic Networks

no code implementations • LREC 2020 • Yeachan Kim, Kang-Min Kim, SangKeun Lee

In the first stage, we learn subword embeddings from the pre-trained word embeddings by using an additive composition function of subwords.

Representation Learning • Word Embeddings
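
The excerpt above describes the first stage: subword embeddings are learned so that their additive composition reproduces a pre-trained word embedding. The sketch below illustrates only that idea; the character n-gram vocabulary, PyTorch setup, and fitting loop are assumptions rather than the paper's implementation.

```python
import torch

# Toy setup: three character n-gram "subwords" composing the word "cat".
subword_vocab = {"<ca": 0, "cat": 1, "at>": 2}
dim = 50
subword_emb = torch.nn.Embedding(len(subword_vocab), dim)

def compose(subword_ids):
    """Additive composition: a word vector is the sum of its subword vectors."""
    return subword_emb(subword_ids).sum(dim=0)

# Fit subword embeddings so their sum approximates the pre-trained word vector.
pretrained_cat = torch.randn(dim)  # stand-in for a real pre-trained embedding
ids = torch.tensor([0, 1, 2])
opt = torch.optim.SGD(subword_emb.parameters(), lr=0.1)
for _ in range(200):
    loss = ((compose(ids) - pretrained_cat) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```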

Learning to Generate Word Representations using Subword Information

no code implementations • COLING 2018 • Yeachan Kim, Kang-Min Kim, Ji-Min Lee, SangKeun Lee

Unlike previous models that learn word representations from a large corpus, we take a set of pre-trained word embeddings and generalize them to new word entries, including OOV words.

Chunking • Language Modelling +5
