Search Results for author: Dae Ha Kim

Found 2 papers, 1 paper with code

Self-supervised Knowledge Distillation Using Singular Value Decomposition

3 code implementations ECCV 2018 Seung Hyun Lee, Dae Ha Kim, Byung Cheol Song

To address the huge training datasets and high computation cost of deep neural networks (DNNs), the so-called teacher-student (T-S) DNN, which transfers the knowledge of a teacher DNN (T-DNN) to a student DNN (S-DNN), has been proposed.
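As a rough illustration of the idea behind SVD-based knowledge transfer, the sketch below (a minimal NumPy toy, not the paper's implementation; all shapes and names are hypothetical) compresses a teacher feature map with a truncated SVD and measures how well a student feature map aligns with the leading right singular vectors:

```python
import numpy as np

# Hypothetical feature maps, flattened to 2-D (channels x spatial).
rng = np.random.default_rng(0)
teacher_feat = rng.standard_normal((64, 256))
student_feat = rng.standard_normal((32, 256))

k = 4  # number of singular vectors treated as the transferred "knowledge"

# Truncated SVD: the leading right singular vectors summarize the
# teacher's feature map in a compact, low-rank form.
_, _, t_Vt = np.linalg.svd(teacher_feat, full_matrices=False)
_, _, s_Vt = np.linalg.svd(student_feat, full_matrices=False)
teacher_knowledge = t_Vt[:k]  # shape (k, 256)
student_knowledge = s_Vt[:k]

# Toy transfer loss: squared cosine alignment per singular vector
# (sign-invariant, since singular vectors are defined up to sign).
loss = sum(1.0 - (teacher_knowledge[i] @ student_knowledge[i]) ** 2
           for i in range(k))
print(f"alignment loss: {loss:.4f}")
```

In a real T-S training loop, a loss of this kind would be backpropagated through the student network so its intermediate features come to span the same low-rank subspace as the teacher's.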

Knowledge Distillation, Transfer Learning
