no code implementations • 4 Feb 2024 • YongDeok Kim, Jaehyung Ahn, Myeongwoo Kim, Changin Choi, Heejae Kim, Narankhuu Tuvshinjargal, Seungwon Lee, Yanzi Zhang, Yuan Pei, Xiongzhan Linghu, Jingkun Ma, Lin Chen, Yuehua Dai, Sungjoo Yoo
Speeding up large-scale distributed training is challenging because it requires improving various components of training, including load balancing, communication, and optimizers.
no code implementations • 1 Feb 2024 • Hyunyoung Jung, Seonghyeon Nam, Nikolaos Sarafianos, Sungjoo Yoo, Alexander Sorkine-Hornung, Rakesh Ranjan
Shape and geometric patterns are essential in defining stylistic identity.
no code implementations • 12 Nov 2023 • Han-Byul Kim, Joo Hyung Lee, Sungjoo Yoo, Hong-Seok Kim
Mixed-precision quantization of efficient networks often suffers from the activation instability encountered in the exploration of bit selections.
no code implementations • 3 Oct 2023 • Jongmin Lee, Yohann Cabon, Romain Brégier, Sungjoo Yoo, Jerome Revaud
Existing learning-based methods for object pose estimation in RGB images are mostly model-specific or category-based.
no code implementations • CVPR 2023 • Hyunyoung Jung, Zhuo Hui, Lei Luo, Haitao Yang, Feng Liu, Sungjoo Yoo, Rakesh Ranjan, Denis Demandolx
To apply optical flow in practice, it is often necessary to resize the input to smaller dimensions in order to reduce computational costs.
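One subtlety of this practice is that optical flow values are pixel displacements, so after estimating flow at reduced resolution the upsampled flow field must also have its values rescaled. A minimal sketch, where `flow_model` stands in for any flow network and the scale factor is an arbitrary choice:

```python
import torch
import torch.nn.functional as F

def flow_at_low_res(flow_model, img1, img2, scale=0.5):
    """Estimate flow on downscaled inputs, then recover full resolution.

    After upsampling the flow field back to the original size, the flow
    values themselves must be divided by `scale` so that displacements
    remain consistent with the original pixel grid.
    """
    h, w = img1.shape[-2:]
    small1 = F.interpolate(img1, scale_factor=scale, mode='bilinear', align_corners=False)
    small2 = F.interpolate(img2, scale_factor=scale, mode='bilinear', align_corners=False)
    flow = flow_model(small1, small2)            # placeholder flow network
    flow = F.interpolate(flow, size=(h, w), mode='bilinear', align_corners=False)
    return flow / scale                          # rescale the displacements too
```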
1 code implementation • ECCV 2022 • Han-Byul Kim, Eunhyeok Park, Sungjoo Yoo
In this paper, we propose Branch-wise Activation-clipping Search Quantization (BASQ), a novel quantization method for low-bit activations.
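The primitive being searched over, clipped uniform quantization of activations, looks roughly like the sketch below; this is a generic illustration with an arbitrary bit width, not BASQ's branch-wise search itself, which is the paper's contribution:

```python
import torch

def quantize_activation(x, clip_val, n_bits=2):
    """Uniform quantization of non-negative activations with a clipping bound.

    Values are clipped to [0, clip_val] and mapped onto 2**n_bits levels.
    Choosing clip_val well is the crux of low-bit activation quantization:
    too small discards large activations, too large wastes levels on
    rarely seen values.
    """
    levels = 2 ** n_bits - 1
    x = torch.clamp(x, 0.0, clip_val)
    step = clip_val / levels
    return torch.round(x / step) * step
```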
no code implementations • 4 Jul 2022 • Namwoo Lee, Hyunsu Kim, Gayoung Lee, Sungjoo Yoo, Yunjey Choi
However, training existing approaches requires a heavy computational cost proportional to the image resolution, since they compute an MLP operation for every (x, y) coordinate.
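The per-coordinate cost being described can be seen in a toy coordinate-MLP evaluation; layer sizes here are illustrative, not the paper's network. Every output pixel requires its own MLP evaluation, so compute grows linearly with H*W:

```python
import torch
import torch.nn as nn

# A coordinate-based MLP maps each (x, y) position to an RGB value, so
# rendering an H x W image costs H*W forward passes through the MLP.
mlp = nn.Sequential(nn.Linear(2, 256), nn.ReLU(), nn.Linear(256, 3))

H, W = 512, 512
ys, xs = torch.meshgrid(torch.linspace(-1, 1, H),
                        torch.linspace(-1, 1, W), indexing='ij')
coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)   # (H*W, 2)
rgb = mlp(coords).reshape(H, W, 3)                      # H*W MLP evaluations
```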
1 code implementation • CVPR 2023 • JunCheol Shin, Junhyuk So, Sein Park, Seungyeop Kang, Sungjoo Yoo, Eunhyeok Park
Recently, pseudo-quantization training has been proposed as an alternative approach that updates the learnable parameters using pseudo-quantization noise instead of the straight-through estimator (STE).
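A hedged sketch of the contrast being drawn, with generic placeholder functions rather than the paper's exact formulation: the STE rounds in the forward pass and ignores the rounding in the backward pass, while pseudo-quantization training replaces rounding with differentiable noise of matching magnitude:

```python
import torch

def fake_quant_ste(w, step):
    """Straight-through estimator: forward pass rounds to the grid,
    backward pass treats the rounding as the identity."""
    return w + (torch.round(w / step) * step - w).detach()

def pseudo_quant_noise(w, step):
    """Pseudo-quantization noise: replace rounding with additive uniform
    noise of the same magnitude, keeping the computation differentiable."""
    noise = (torch.rand_like(w) - 0.5) * step
    return w + noise
```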
no code implementations • 29 Dec 2021 • Sungmin Cho, Hongjun Lim, Keunchan Park, Sungjoo Yoo, Eunhyeok Park
Personalized news recommendation aims to provide attractive articles for readers by predicting their likelihood of clicking on a certain article.
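In its simplest form, click-likelihood prediction of this kind reduces to scoring user-article pairs; a toy sketch with invented names and sizes, not the paper's recommender:

```python
import torch
import torch.nn as nn

class ClickScorer(nn.Module):
    """Score a (user, article) pair as a click probability via a dot
    product of learned embeddings. A deliberately minimal stand-in."""
    def __init__(self, n_users, n_articles, dim=32):
        super().__init__()
        self.user = nn.Embedding(n_users, dim)
        self.article = nn.Embedding(n_articles, dim)

    def forward(self, u, a):
        score = (self.user(u) * self.article(a)).sum(-1)
        return torch.sigmoid(score)   # likelihood of a click
```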
2 code implementations • ICCV 2021 • Hyunyoung Jung, Eunhyeok Park, Sungjoo Yoo
Self-supervised monocular depth estimation has been widely studied, owing to its practical importance and recent promising improvements.
1 code implementation • CVPR 2021 • Hyunsu Kim, Yunjey Choi, Junho Kim, Sungjoo Yoo, Youngjung Uh
Although manipulating the latent vectors controls the synthesized outputs, editing real images with GANs suffers from i) time-consuming optimization for projecting real images to the latent vectors or ii) inaccurate embedding through an encoder.
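The "time-consuming optimization" in i) is, in generic form, a GAN-inversion loop like the following; the generator `G`, latent dimension, and hyperparameters are placeholders, and this standard loop is the baseline the paper improves on, not its contribution:

```python
import torch

def project_to_latent(G, target, steps=1000, lr=0.01, latent_dim=512):
    """Optimization-based GAN inversion: fit a latent vector so that the
    pretrained generator G reconstructs the target image. Typically
    hundreds of iterations per image, hence 'time-consuming'."""
    w = torch.randn(1, latent_dim, requires_grad=True)
    opt = torch.optim.Adam([w], lr=lr)
    for _ in range(steps):
        loss = torch.mean((G(w) - target) ** 2)   # pixel reconstruction error
        opt.zero_grad()
        loss.backward()
        opt.step()
    return w.detach()
```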
no code implementations • 1 Jan 2021 • Hyunsu Kim, Yunjey Choi, Junho Kim, Sungjoo Yoo, Youngjung Uh
State-of-the-art GAN-based methods for editing real images suffer from time-consuming operations in projecting real images to latent vectors.
1 code implementation • 19 Aug 2020 • Sung Min Cho, Eunhyeok Park, Sungjoo Yoo
Following the convention of language processing, most of these models rely on a simple positional embedding to exploit the sequential nature of the user's history.
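The "simple positional embedding" in question is typically implemented as below; sizes are toy values and this is the generic recipe borrowed from language models, not this paper's model:

```python
import torch
import torch.nn as nn

# Encode an item sequence by summing a learned item embedding with a
# learned positional embedding, so the model can tell order apart.
n_items, max_len, dim = 10000, 50, 64
item_emb = nn.Embedding(n_items, dim)
pos_emb = nn.Embedding(max_len, dim)

history = torch.randint(0, n_items, (8, max_len))   # batch of item-ID sequences
positions = torch.arange(max_len).unsqueeze(0)      # (1, max_len), broadcast over batch
x = item_emb(history) + pos_emb(positions)          # order-aware input representation
```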
1 code implementation • ECCV 2020 • Eunhyeok Park, Sungjoo Yoo
In the ablation study of the 3-bit quantization of MobileNet-v3, our proposed method outperforms the state-of-the-art method by a large margin of 12.86% in top-1 accuracy.
2 code implementations • ICCV 2019 • Hyunsu Kim, Ho Young Jhoo, Eunhyeok Park, Sungjoo Yoo
A GAN-based approach to line art colorization, called Tag2Pix, is proposed, which takes as input a grayscale line art and color tag information and produces a high-quality colored image.
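As a rough illustration of the stated input/output interface only (the module name, tag vocabulary size, and layer sizes below are invented for the sketch, not taken from Tag2Pix):

```python
import torch
import torch.nn as nn

class TagConditionedColorizer(nn.Module):
    """Toy stand-in for a tag-conditioned colorization generator: a
    grayscale line art plus a multi-hot color-tag vector in, RGB out."""
    def __init__(self, n_tags=100):
        super().__init__()
        self.tag_fc = nn.Linear(n_tags, 16)
        self.net = nn.Sequential(
            nn.Conv2d(1 + 16, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Tanh())

    def forward(self, line_art, tags):
        b, _, h, w = line_art.shape
        # Broadcast the tag embedding spatially and fuse with the line art.
        t = self.tag_fc(tags).view(b, 16, 1, 1).expand(b, 16, h, w)
        return self.net(torch.cat([line_art, t], dim=1))
```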
no code implementations • ICLR 2019 • Eunhyeok Park, Dongyoung Kim, Sungjoo Yoo, Peter Vajda
We also report that the proposed method significantly outperforms the existing method in the 2-bit quantization of an LSTM for language modeling.
no code implementations • 24 Nov 2018 • Jongsoo Park, Maxim Naumov, Protonu Basu, Summer Deng, Aravind Kalaiah, Daya Khudia, James Law, Parth Malani, Andrey Malevich, Satish Nadathur, Juan Pino, Martin Schatz, Alexander Sidorov, Viswanath Sivakumar, Andrew Tulloch, Xiaodong Wang, Yiming Wu, Hector Yuen, Utku Diril, Dmytro Dzhulgakov, Kim Hazelwood, Bill Jia, Yangqing Jia, Lin Qiao, Vijay Rao, Nadav Rotem, Sungjoo Yoo, Mikhail Smelyanskiy
The application of deep learning techniques has resulted in remarkable improvements in machine learning models.
no code implementations • ECCV 2018 • Eunhyeok Park, Sungjoo Yoo, Peter Vajda
We propose a novel value-aware quantization that applies aggressively reduced precision to the majority of data while separately handling a small amount of large-magnitude data in high precision, thereby reducing the total quantization error under very low precision.
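A rough sketch of the idea as stated, keeping the few largest-magnitude values in full precision and quantizing the rest at low precision; the outlier ratio and the uniform rounding scheme are assumptions for illustration, not the paper's exact algorithm:

```python
import torch

def value_aware_quantize(w, n_bits=3, outlier_ratio=0.01):
    """Quantize the bulk of small values at low precision while leaving
    the largest-magnitude ~1% untouched in full precision."""
    k = max(1, int(w.numel() * outlier_ratio))
    thresh = w.abs().flatten().topk(k).values.min()   # magnitude cutoff
    outliers = w.abs() >= thresh
    levels = 2 ** n_bits - 1
    step = thresh / levels if thresh > 0 else 1.0
    q = torch.round(w / step) * step                  # low-precision body
    return torch.where(outliers, w, q)                # large values stay exact
```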
no code implementations • CVPR 2017 • Eunhyeok Park, Junwhan Ahn, Sungjoo Yoo
Quantization is considered one of the most effective methods for optimizing the inference cost of neural network models for deployment to mobile and embedded systems, which have tight resource constraints.
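For context, a minimal sketch of standard affine (scale + zero-point) quantization, the textbook mechanism behind such inference-cost reductions, not this paper's specific scheme:

```python
import torch

def affine_quantize(w, n_bits=8):
    """Map real-valued weights onto 2**n_bits integer levels using a
    scale and zero-point; dequantize with scale * (q - zero_point)."""
    qmin, qmax = 0, 2 ** n_bits - 1
    scale = (w.max() - w.min()) / (qmax - qmin)
    zero_point = torch.round(-w.min() / scale)
    q = torch.clamp(torch.round(w / scale + zero_point), qmin, qmax)
    return q, scale, zero_point
```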
7 code implementations • 20 Nov 2015 • Yong-Deok Kim, Eunhyeok Park, Sungjoo Yoo, Taelim Choi, Lu Yang, Dongjun Shin
Although the latest high-end smartphones have powerful CPUs and GPUs, running deeper convolutional neural networks (CNNs) for complex tasks such as ImageNet classification on mobile devices is challenging.