no code implementations • 5 Feb 2024 • Jaerin Lee, JoonKyu Park, Sungyong Baik, Kyoung Mu Lee
Image restoration models are typically trained with a pixel-wise distance loss defined over the RGB color representation space, which is well known to be a source of blurry and unrealistic textures in the restored images.
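For reference, a minimal PyTorch sketch of the standard pixel-wise RGB distance loss the abstract refers to (the baseline objective being criticized, not the method proposed in the paper); tensor shapes and the L1/L2 choice are illustrative assumptions:

```python
# Sketch of the generic pixel-wise distance loss computed directly in RGB space.
import torch
import torch.nn.functional as F

def pixelwise_rgb_loss(restored: torch.Tensor, target: torch.Tensor, mode: str = "l1") -> torch.Tensor:
    """restored, target: (N, 3, H, W) RGB tensors in [0, 1]."""
    if mode == "l1":
        return F.l1_loss(restored, target)   # mean absolute error per pixel/channel
    return F.mse_loss(restored, target)      # mean squared error (L2)

# Example: a dummy batch of 4 RGB images, 64x64.
restored = torch.rand(4, 3, 64, 64)
target = torch.rand(4, 3, 64, 64)
print(pixelwise_rgb_loss(restored, target).item())
```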
no code implementations • ICCV 2023 • Eunhye Lee, Jinsu Yoo, Yunjeong Yang, Sungyong Baik, Tae Hyun Kim
Recent learning-based video inpainting approaches have achieved considerable progress.
no code implementations • 16 Dec 2022 • JaeYoung Chung, Kanggeon Lee, Sungyong Baik, Kyoung Mu Lee
Under such incremental learning scenarios, neural networks are known to suffer from catastrophic forgetting: they easily forget previously seen data after training on new data.

1 code implementation • 21 Jul 2022 • Cheeun Hong, Sungyong Baik, Heewon Kim, Seungjun Nah, Kyoung Mu Lee
In this work, to achieve high average bit-reduction with less accuracy loss, we propose a novel Content-Aware Dynamic Quantization (CADyQ) method for SR networks that allocates optimal bits to local regions and layers adaptively based on the local contents of an input image.
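For illustration only, a toy PyTorch sketch of the general idea of content-aware bit allocation: choose a bit-width per image patch from a simple gradient-magnitude measure and uniformly quantize it. The thresholds, bit candidates, and quantizer below are illustrative assumptions and do not reproduce the actual CADyQ criterion:

```python
# Toy illustration (not the CADyQ implementation): pick a bit-width per patch
# from its local gradient magnitude, then uniformly quantize that patch.
import torch

def patch_bit_width(patch: torch.Tensor, thresholds=(0.05, 0.15)) -> int:
    # Mean absolute horizontal/vertical differences as a cheap "content" measure.
    grad = (patch[..., :, 1:] - patch[..., :, :-1]).abs().mean() + \
           (patch[..., 1:, :] - patch[..., :-1, :]).abs().mean()
    if grad < thresholds[0]:
        return 4          # flat region: few bits suffice
    if grad < thresholds[1]:
        return 6
    return 8              # detailed region: keep more bits

def uniform_quantize(x: torch.Tensor, bits: int) -> torch.Tensor:
    levels = 2 ** bits - 1
    x_min, x_max = x.min(), x.max()
    scale = (x_max - x_min).clamp(min=1e-8) / levels
    return torch.round((x - x_min) / scale) * scale + x_min

patch = torch.rand(3, 32, 32)
quantized = uniform_quantize(patch, patch_bit_width(patch))
```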
no code implementations • 2 Dec 2021 • Junghun Oh, Heewon Kim, Sungyong Baik, Cheeun Hong, Kyoung Mu Lee
The goal of filter pruning is to identify and remove unimportant filters so that convolutional neural networks (CNNs) become efficient without sacrificing performance.
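A minimal sketch of filter pruning with the common L1-norm importance criterion, used here purely as an example (it is not necessarily the criterion adopted in the paper); the layer sizes and keep ratio are illustrative assumptions:

```python
# Rank conv filters by the L1 norm of their weights and keep only the top ones.
import torch
import torch.nn as nn

conv = nn.Conv2d(in_channels=16, out_channels=32, kernel_size=3, padding=1)
keep_ratio = 0.5

# Importance score per output filter: L1 norm over (in_channels, kH, kW).
scores = conv.weight.detach().abs().sum(dim=(1, 2, 3))
num_keep = int(conv.out_channels * keep_ratio)
keep_idx = torch.argsort(scores, descending=True)[:num_keep]

# Build a smaller conv layer containing only the retained filters.
pruned = nn.Conv2d(16, num_keep, kernel_size=3, padding=1)
with torch.no_grad():
    pruned.weight.copy_(conv.weight[keep_idx])
    pruned.bias.copy_(conv.bias[keep_idx])
```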
1 code implementation • ICCV 2021 • Sungyong Baik, Janghoon Choi, Heewon Kim, Dohee Cho, Jaesik Min, Kyoung Mu Lee
The problem is that each application and task may require a different auxiliary loss function, especially when tasks are diverse and distinct.
2 code implementations • 21 Dec 2020 • Cheeun Hong, Heewon Kim, Sungyong Baik, Junghun Oh, Kyoung Mu Lee
Quantizing deep convolutional neural networks for image super-resolution substantially reduces their computational costs.
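As a generic illustration of why quantization cuts cost, a short sketch that fake-quantizes a convolution layer's weights to k-bit uniform levels (k-bit integers instead of 32-bit floats); this is not the quantization scheme proposed in the paper, and the layer shape is an illustrative assumption:

```python
# Simulated (fake) uniform weight quantization of a single conv layer.
import torch
import torch.nn as nn

def quantize_weights(conv: nn.Conv2d, bits: int = 8) -> None:
    """Fake-quantize weights in place: round to symmetric uniform levels."""
    w = conv.weight.data
    scale = w.abs().max().clamp(min=1e-8) / (2 ** (bits - 1) - 1)
    conv.weight.data = torch.round(w / scale) * scale  # stored back as float for simulation

conv = nn.Conv2d(64, 64, kernel_size=3, padding=1)  # a typical SR body layer
quantize_weights(conv, bits=4)
```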
no code implementations • ICCV 2021 • Heewon Kim, Sungyong Baik, Myungsub Choi, Janghoon Choi, Kyoung Mu Lee
Diverse user preferences over images have recently led to considerable interest in controlling imagery effects in image restoration tasks.
2 code implementations • NeurIPS 2020 • Sungyong Baik, Myungsub Choi, Janghoon Choi, Heewon Kim, Kyoung Mu Lee
Despite the popularity of MAML, several recent works question its effectiveness when test tasks differ from training tasks, and thus suggest various task-conditioned methodologies to improve the initialization.
no code implementations • 21 Aug 2020 • Sungyong Baik, Hyo Jin Kim, Tianwei Shen, Eddy Ilg, Kyoung Mu Lee, Chris Sweeney
We tackle the problem of visual localization under changing conditions, such as time of day, weather, and seasons.
1 code implementation • CVPR 2020 • Myungsub Choi, Janghoon Choi, Sungyong Baik, Tae Hyun Kim, Kyoung Mu Lee
Finally, we show that our meta-learning framework can be easily applied to any video frame interpolation network and consistently improves its performance on multiple benchmark datasets.
1 code implementation • CVPR 2020 • Sungyong Baik, Seokil Hong, Kyoung Mu Lee
Model-agnostic meta-learning (MAML) tackles the problem by formulating prior knowledge as a common initialization across tasks, which is then used to quickly adapt to unseen tasks.
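A hedged sketch of the adaptation step the abstract describes: starting from a shared (meta-learned) initialization, a copy of the model takes a few gradient steps on a new task. The toy regression task, model, and step sizes below are illustrative assumptions, and meta-training of the initialization itself is omitted:

```python
# Few-step gradient adaptation from a shared initialization (MAML-style inner loop).
import torch
import torch.nn as nn

def adapt(model: nn.Module, x: torch.Tensor, y: torch.Tensor,
          inner_lr: float = 0.01, steps: int = 5) -> nn.Module:
    """Fine-tune the model on one task with a handful of gradient steps."""
    loss_fn = nn.MSELoss()
    for _ in range(steps):
        loss = loss_fn(model(x), y)
        grads = torch.autograd.grad(loss, model.parameters())
        with torch.no_grad():
            for p, g in zip(model.parameters(), grads):
                p -= inner_lr * g  # manual SGD step on the task data
    return model

# One unseen regression "task": adapt a copy of the shared initialization to it.
shared_init = nn.Linear(4, 1)
task_model = nn.Linear(4, 1)
task_model.load_state_dict(shared_init.state_dict())
x, y = torch.randn(16, 4), torch.randn(16, 1)
adapt(task_model, x, y)
```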