Search Results for author: Donggon Jang

Found 3 papers, 1 paper with code

Maximizing Discrimination Capability of Knowledge Distillation with Energy Function

no code implementations • 24 Nov 2023 • Seonghak Kim, Gyeongdo Ham, SuIn Lee, Donggon Jang, Daeshik Kim

To distill optimal knowledge by adjusting non-target class predictions, we apply a higher temperature to low energy samples to create smoother distributions and a lower temperature to high energy samples to achieve sharper distributions.

Data Augmentation · Knowledge Distillation
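The abstract above describes assigning each sample a temperature based on its energy: low-energy samples get a higher temperature (smoother soft targets), high-energy samples a lower one (sharper targets). A minimal sketch of that idea follows; the median-energy split, the `t_low`/`t_high` values, and the function names are illustrative assumptions, not the paper's exact rule.

```python
import math

def energy(logits):
    """Energy of one sample: E(x) = -logsumexp(z).
    Lower energy corresponds to larger, more confident logits."""
    m = max(logits)
    return -(m + math.log(sum(math.exp(z - m) for z in logits)))

def softmax(logits, temperature):
    """Temperature-scaled softmax for one sample (numerically stable)."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def adaptive_soft_targets(batch_logits, t_low=2.0, t_high=4.0):
    """Per-sample temperature chosen by comparing each sample's energy
    to the batch median (an assumed splitting rule): low-energy samples
    get t_high to smooth their distribution, high-energy samples get
    t_low to sharpen it."""
    energies = [energy(z) for z in batch_logits]
    median = sorted(energies)[len(energies) // 2]
    return [softmax(z, t_high if e <= median else t_low)
            for z, e in zip(batch_logits, energies)]
```

In a distillation loop, these adaptively smoothed teacher outputs would replace the usual fixed-temperature soft targets in the KL-divergence term.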

Unsupervised Image Denoising with Frequency Domain Knowledge

1 code implementation • 29 Nov 2021 • Nahyun Kim, Donggon Jang, Sunhyeok Lee, Bomi Kim, Dae-shik Kim

Supervised learning-based methods yield robust denoising results, yet they are inherently limited by the need for large-scale clean/noisy paired datasets.

Generative Adversarial Network · Image Denoising
