Search Results for author: Chanho Min

Found 2 papers, 0 papers with code

Minimum width for universal approximation using ReLU networks on compact domain

no code implementations · 19 Sep 2023 · Namjun Kim, Chanho Min, Sejun Park

We next prove a lower bound on $w_{\min}$ for uniform approximation using general activation functions including ReLU: $w_{\min}\ge d_y+1$ if $d_x<d_y\le2d_x$.
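The bound above concerns the width of fixed-width ReLU networks, i.e. the common dimension of every hidden layer. As a point of reference, here is a minimal numpy sketch of such a deep, fixed-width ReLU network; the dimensions (d_x = 2, d_y = 3, width w = 4, which satisfies w ≥ d_y + 1) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def narrow_relu_net(x, weights, biases):
    """Forward pass of a deep ReLU network whose hidden layers all share
    the same width w. x: (d_x,) input; returns a (d_y,) output."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(W @ h + b)          # hidden layers: affine map + ReLU
    W_out, b_out = weights[-1], biases[-1]
    return W_out @ h + b_out         # output layer: affine only, no ReLU

# Hypothetical dimensions for illustration: d_x = 2, d_y = 3, width w = 4
rng = np.random.default_rng(0)
d_x, d_y, w, depth = 2, 3, 4, 5
dims = [d_x] + [w] * depth + [d_y]
weights = [rng.standard_normal((dims[i + 1], dims[i])) for i in range(len(dims) - 1)]
biases = [rng.standard_normal(dims[i + 1]) for i in range(len(dims) - 1)]
y = narrow_relu_net(rng.standard_normal(d_x), weights, biases)
print(y.shape)  # (3,)
```

Universal-approximation results of this kind ask how small w can be while such networks (of arbitrary depth) still approximate every target function in the given norm.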

Deep Collective Knowledge Distillation

no code implementations · 18 Apr 2023 · Jihyeon Seo, Kyusam Oh, Chanho Min, Yongkeun Yun, Sungwoo Cho

We propose deep collective knowledge distillation (DCKD) for model compression: a method for training student models to acquire rich knowledge not only from their teacher model but also from the other student models.
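The idea of distilling from both a teacher and peer students can be sketched with a simple softened-output matching loss. This is a hypothetical illustration of the general collective-distillation pattern, not DCKD's actual objective; the function names, temperature T, and weight alpha are all assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)   # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q, eps=1e-12):
    """Elementwise KL divergence KL(p || q) per sample."""
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

def collective_kd_loss(student_logits, teacher_logits, peer_logits_list,
                       T=4.0, alpha=0.5):
    """Hypothetical collective distillation loss: the student matches the
    teacher's softened outputs and, with weight alpha, the averaged softened
    outputs of its peer students. Illustrative only, not the DCKD objective."""
    q_student = softmax(student_logits, T)
    loss = kl(softmax(teacher_logits, T), q_student).mean()
    if peer_logits_list:
        p_peers = np.mean([softmax(z, T) for z in peer_logits_list], axis=0)
        loss = loss + alpha * kl(p_peers, q_student).mean()
    return loss

# Toy example: batch of 2, 5 classes, one teacher, two peer students
rng = np.random.default_rng(1)
s, t = rng.standard_normal((2, 5)), rng.standard_normal((2, 5))
peers = [rng.standard_normal((2, 5)) for _ in range(2)]
loss = collective_kd_loss(s, t, peers)
print(loss >= 0.0)  # KL divergences are non-negative
```

In this pattern each student would use such a loss alongside the usual cross-entropy on ground-truth labels, so the group of students trains jointly rather than each one distilling from the teacher in isolation.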

Knowledge Distillation · Model Compression
