Search Results for author: Hyeonmin Ha

Found 2 papers, 1 paper with code

SUMNAS: Supernet with Unbiased Meta-Features for Neural Architecture Search

no code implementations • ICLR 2022 • Hyeonmin Ha, Ji-Hoon Kim, Semin Park, Byung-Gon Chun

We propose Supernet with Unbiased Meta-Features for Neural Architecture Search (SUMNAS), a supernet learning strategy based on meta-learning to tackle the knowledge forgetting issue.

Computational Efficiency • Meta-Learning • +1
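The entry above describes training a supernet with a meta-learning strategy so that adapting one sampled subnetwork does not erase what the shared weights learned for others. A minimal illustrative sketch of that general pattern is below — a Reptile-style meta-update on masked shared weights. This is an assumption-laden toy, not the SUMNAS algorithm: the binary architecture masks, the toy regression task, and all learning rates are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "supernet": one shared weight vector; each candidate architecture is a
# binary mask selecting a subset of the shared weights (illustrative only).
supernet_w = rng.normal(size=8)
data_x = rng.normal(size=(32, 8))
data_y = data_x @ np.ones(8)  # toy regression target

def loss(w, mask):
    err = data_x @ (w * mask) - data_y
    return float(err @ err) / len(data_x)

def loss_grad(w, mask):
    err = data_x @ (w * mask) - data_y
    return (data_x * mask).T @ err / len(data_x)

# Reptile-style supernet training: adapt a randomly sampled subnet with a few
# inner SGD steps, then move the shared weights toward the adapted weights,
# so no single subnet's update overwrites the others outright.
for step in range(100):
    mask = (rng.random(8) < 0.5).astype(float)
    if mask.sum() == 0:
        continue
    l_before = loss(supernet_w, mask)
    w_inner = supernet_w.copy()
    for _ in range(3):                          # inner adaptation steps
        w_inner -= 0.05 * loss_grad(w_inner, mask)
    l_after = loss(w_inner, mask)
    supernet_w += 0.5 * (w_inner - supernet_w)  # meta-step toward adapted subnet
```

The meta-step (interpolating toward the adapted weights rather than applying the subnet gradient directly) is what distinguishes this from plain weight-sharing supernet training.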

Parallax: Automatic Data-Parallel Training of Deep Neural Networks

1 code implementation • 8 Aug 2018 • Soojeong Kim, Gyeong-In Yu, Hojin Park, Sungwoo Cho, Eunji Jeong, Hyeonmin Ha, Sanha Lee, Joo Seong Jeong, Byung-Gon Chun

The employment of high-performance servers and GPU accelerators for training deep neural network models has greatly accelerated recent advances in machine learning (ML).

Distributed, Parallel, and Cluster Computing
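Parallax concerns automatic data-parallel training. The core data-parallel pattern — replicate the model, shard the data across workers, average per-worker gradients before each update — can be sketched minimally as below. This is a generic illustration of data parallelism, not Parallax's TensorFlow-based system; the worker count, linear model, and learning rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

NUM_WORKERS = 4  # assumed worker count, simulated in-process

# Toy replicated model: linear regression on synthetic, exactly realizable data.
w_global = np.zeros(5)
X = rng.normal(size=(256, 5))
true_w = np.array([1.0, -2.0, 0.5, 3.0, -1.0])
y = X @ true_w

# Partition the training data across workers (equal-size shards).
shards = np.array_split(np.arange(len(X)), NUM_WORKERS)

def local_grad(w, idx):
    # Gradient of mean squared error on one worker's shard.
    err = X[idx] @ w - y[idx]
    return X[idx].T @ err / len(idx)

for step in range(200):
    # Each worker computes a gradient on its own shard...
    grads = [local_grad(w_global, idx) for idx in shards]
    # ...then gradients are averaged (the all-reduce step) and the
    # averaged update is applied to every model replica.
    w_global -= 0.1 * np.mean(grads, axis=0)
```

With equal-size shards, the averaged gradient equals the full-batch gradient, so the replicas stay in lockstep — the property that makes data-parallel training equivalent to large-batch single-machine training.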
