Search Results for author: Ikhyun Cho

Found 5 papers, 0 papers with code

ViT-MUL: A Baseline Study on Recent Machine Unlearning Methods Applied to Vision Transformers

no code implementations • 7 Feb 2024 • Ikhyun Cho, Changyeon Park, Julia Hockenmaier

Machine unlearning (MUL) is an emerging field in machine learning that seeks to erase the learned information of specific training data points from a trained model.

Machine Unlearning
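
For context, unlearning methods of this kind start from a trained model and an identified "forget set" of training points whose influence should be removed. A minimal sketch of one common baseline, gradient ascent on the forget set, is shown below; this is a generic approach and not necessarily one of the methods benchmarked in the paper, and model / forget_loader are assumed PyTorch objects.

    import torch
    import torch.nn as nn

    def unlearn_by_gradient_ascent(model, forget_loader, lr=1e-4, steps=1):
        # Push the model away from its fit on the data to be forgotten by
        # maximizing (rather than minimizing) the loss on the forget set.
        opt = torch.optim.SGD(model.parameters(), lr=lr)
        loss_fn = nn.CrossEntropyLoss()
        model.train()
        for _ in range(steps):
            for x, y in forget_loader:
                opt.zero_grad()
                loss = -loss_fn(model(x), y)  # negated loss = gradient ascent
                loss.backward()
                opt.step()
        return model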

Attack and Reset for Unlearning: Exploiting Adversarial Noise toward Machine Unlearning through Parameter Re-initialization

no code implementations • 17 Jan 2024 • Yoonhwa Jung, Ikhyun Cho, Shun-Hsiang Hsu, Julia Hockenmaier

With growing concerns surrounding privacy and regulatory compliance, the concept of machine unlearning has gained prominence: it aims to selectively forget or erase specific learned information from a trained model.

Machine Unlearning
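
The title suggests a two-step pattern: use adversarial noise to locate parameters tied to the forget data, then re-initialize them. The following is a rough sketch of that general pattern under my own assumptions, not the paper's exact algorithm; the FGSM-style perturbation and the per-tensor top-k reset are illustrative choices.

    import torch
    import torch.nn as nn

    def reset_salient_params(model, forget_loader, epsilon=0.03, reset_frac=0.01):
        loss_fn = nn.CrossEntropyLoss()
        saliency = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
        for x, y in forget_loader:
            x = x.clone().requires_grad_(True)
            loss_fn(model(x), y).backward()                  # gradient w.r.t. the input
            x_adv = (x + epsilon * x.grad.sign()).detach()   # FGSM-style adversarial noise
            model.zero_grad()
            loss_fn(model(x_adv), y).backward()              # gradient w.r.t. the parameters
            for n, p in model.named_parameters():
                if p.grad is not None:
                    saliency[n] += p.grad.abs()
            model.zero_grad()
        with torch.no_grad():                                # reset the most salient weights
            for n, p in model.named_parameters():
                k = max(1, int(reset_frac * p.numel()))
                idx = saliency[n].view(-1).topk(k).indices
                p.view(-1)[idx] = torch.randn(k, device=p.device, dtype=p.dtype) * 0.02
        return model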

Pea-KD: Parameter-efficient and Accurate Knowledge Distillation on BERT

no code implementations • 30 Sep 2020 • Ikhyun Cho, U Kang

PTP is a KD-specialized initialization method, which can act as a good initial guide for the student.

Knowledge Distillation • Model Compression
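
For background on the KD setting Pea-KD operates in (this is the standard soft-label distillation objective, not the paper's PTP initialization itself), the student is typically trained against a blend of the teacher's softened predictions and the hard labels:

    import torch.nn.functional as F

    def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
        # Temperature-scaled KL term against the teacher's soft distribution,
        # blended with the ordinary hard-label cross-entropy.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1 - alpha) * hard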

Pea-KD: Parameter-efficient and accurate Knowledge Distillation

no code implementations • 28 Sep 2020 • Ikhyun Cho, U Kang

SPS is a new parameter sharing method that allows greater model complexity for the student.

Knowledge Distillation • Model Compression
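
As a rough illustration of cross-layer parameter sharing in general (the idea SPS builds on; the paper's particular sharing scheme differs), a student encoder can reuse a small pool of layers to gain depth without adding parameters:

    import torch.nn as nn

    class SharedLayerEncoder(nn.Module):
        # A 6-layer student that stores only 2 distinct Transformer layers and
        # reuses them cyclically, so compute depth grows while the parameter
        # count stays that of 2 layers.
        def __init__(self, d_model=256, n_heads=4, n_layers=6, n_unique=2):
            super().__init__()
            self.blocks = nn.ModuleList(
                nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
                for _ in range(n_unique)
            )
            self.n_layers = n_layers

        def forward(self, x):
            for i in range(self.n_layers):
                x = self.blocks[i % len(self.blocks)](x)
            return x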
