no code implementations • 19 Apr 2023 • Seongho Joe, Byoungjip Kim, Hoyoung Kang, Kyoungwon Park, Bogun Kim, Jaeseon Park, Joonseok Lee, Youngjune Gwon
The recent advances in representation learning inspire us to take on the challenging problem of unsupervised image classification in a principled way.
no code implementations • 19 Apr 2023 • Joonseok Lee, Seongho Joe, Kyoungwon Park, Bogun Kim, Hoyoung Kang, Jaeseon Park, Youngjune Gwon
We propose a self-supervised learning method for long text documents based on contrastive learning.
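Contrastive methods of this kind typically optimize an NT-Xent (InfoNCE) objective: two views of the same document are pulled together while all other documents in the batch serve as negatives. A minimal numpy sketch of that objective, not the authors' implementation (embedding sizes and values are illustrative):

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent loss over two batches of embeddings.

    z1[i] and z2[i] are two views of the same document (e.g., two
    sampled spans); every other row in the batch is a negative.
    """
    # L2-normalize so dot products are cosine similarities
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    z = np.concatenate([z1, z2], axis=0)            # (2N, d)
    sim = z @ z.T / temperature                     # pairwise similarities
    np.fill_diagonal(sim, -np.inf)                  # exclude self-pairs
    n = len(z1)
    # row i's positive is row i+n (and vice versa)
    targets = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), targets].mean()
```

Matched view pairs should yield a lower loss than deliberately mismatched ones.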
1 code implementation • 13 Jul 2022 • Jinho Choo, Yeong-Dae Kwon, Jihoon Kim, Jeongwoo Jae, André Hottung, Kevin Tierney, Youngjune Gwon
Neural approaches for combinatorial optimization (CO) employ a learning mechanism to discover powerful heuristics for solving complex real-world problems.
no code implementations • 16 Aug 2021 • Yonghyun Jeong, Doyeon Kim, Seungjai Min, Seongho Joe, Youngjune Gwon, Jongwon Choi
Advances in generative models have a two-fold effect: simple, easy generation of realistic synthesized images, but also an increased risk of malicious abuse of those images.
1 code implementation • ICCV 2021 • Jooyoung Choi, Sungwon Kim, Yonghyun Jeong, Youngjune Gwon, Sungroh Yoon
In this work, we propose Iterative Latent Variable Refinement (ILVR), a method to guide the generative process in DDPM to generate high-quality images based on a given reference image.
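The core of ILVR is a per-step correction: after each unconditional denoising step, the low-frequency content of the proposal is replaced with the low-frequency content of the (noised) reference image, so only high-frequency detail is left to the model. A minimal numpy sketch of that refinement step under the assumption that the low-pass filter φ is average-pool-then-upsample (the paper uses a resizing filter of the same spirit); shapes and the factor are illustrative:

```python
import numpy as np

def low_pass(x, factor=4):
    """phi: average-pool a square image by `factor`, then upsample back."""
    h, w = x.shape
    pooled = x.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))
    return pooled.repeat(factor, axis=0).repeat(factor, axis=1)

def ilvr_refine(x_prev, y_t, factor=4):
    """One ILVR step: keep x_prev's high-frequency detail, but swap its
    low-frequency content for that of the noised reference y_t."""
    return x_prev - low_pass(x_prev, factor) + low_pass(y_t, factor)
```

Because φ here is an idempotent linear projection, the refined sample provably shares its low-pass component with the reference.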
1 code implementation • NeurIPS 2021 • Yeong-Dae Kwon, Jinho Choo, Iljoo Yoon, Minah Park, Duwon Park, Youngjune Gwon
A popular approach is to use a neural net to process the parameters of a given CO problem and extract useful information that guides the search for good solutions.
no code implementations • 27 Jan 2021 • Hyunjae Lee, Jaewoong Yoon, Bonggyu Hwang, Seongho Joe, Seungjai Min, Youngjune Gwon
A Lite BERT (ALBERT) has been introduced to scale up deep bidirectional representation learning for natural languages.
no code implementations • 26 Jan 2021 • Hyunjin Choi, Judong Kim, Seongho Joe, Seungjai Min, Youngjune Gwon
In zero-shot cross-lingual transfer, a supervised NLP task trained on a corpus in one language is directly applicable to another language without any additional training.
no code implementations • 26 Jan 2021 • Hyunjin Choi, Judong Kim, Seongho Joe, Youngjune Gwon
The pre-trained BERT and A Lite BERT (ALBERT) models can be fine-tuned to give state-of-the-art results in sentence-pair regressions such as semantic textual similarity (STS) and natural language inference (NLI).
no code implementations • 16 Jan 2021 • Byoungjip Kim, Jinho Choo, Yeong-Dae Kwon, Seongho Joe, Seungjai Min, Youngjune Gwon
This paper introduces SelfMatch, a semi-supervised learning method that combines the power of contrastive self-supervised learning and consistency regularization.
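The consistency-regularization half of such pipelines is commonly FixMatch-style: predictions on weakly augmented inputs become pseudo-labels for strongly augmented views, but only where the model is already confident. A minimal numpy sketch of that term, not the authors' exact loss; the threshold value is illustrative:

```python
import numpy as np

def consistency_loss(weak_probs, strong_logits, threshold=0.95):
    """Cross-entropy of strongly-augmented predictions against pseudo-labels
    taken from weakly-augmented predictions, masked by confidence."""
    pseudo = weak_probs.argmax(axis=1)              # hard pseudo-labels
    mask = weak_probs.max(axis=1) >= threshold      # keep confident examples only
    log_probs = strong_logits - np.log(
        np.exp(strong_logits).sum(axis=1, keepdims=True))
    ce = -log_probs[np.arange(len(pseudo)), pseudo]
    return (ce * mask).sum() / max(mask.sum(), 1)
```

Low-confidence examples contribute nothing, so early in training the term is effectively off and ramps up as predictions sharpen.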
2 code implementations • NeurIPS 2020 • Yeong-Dae Kwon, Jinho Choo, Byoungjip Kim, Iljoo Yoon, Youngjune Gwon, Seungjai Min
We introduce Policy Optimization with Multiple Optima (POMO), an end-to-end approach for building such a heuristic solver.
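POMO's key trick is a shared baseline: for each problem instance, N trajectories are rolled out from N different starting nodes, and each trajectory's REINFORCE advantage is its reward minus the mean reward of its siblings, removing the need for a learned critic. A minimal numpy sketch of that baseline computation (array shapes are illustrative):

```python
import numpy as np

def pomo_advantages(rewards):
    """Shared-baseline advantages for POMO-style training.

    rewards: (batch, N) array -- one row per problem instance,
    one column per rollout from a distinct starting node.
    Each rollout is scored against the mean of its siblings.
    """
    baseline = rewards.mean(axis=1, keepdims=True)
    return rewards - baseline
```

By construction the advantages in each row sum to zero, so the gradient only rewards rollouts that beat their instance's average.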
no code implementations • CVPR 2021 • Jongwon Choi, Kwang Moo Yi, Ji-Hoon Kim, Jinho Choo, Byoungjip Kim, Jin-Yeop Chang, Youngjune Gwon, Hyung Jin Chang
We show that our method can be applied to classification tasks on multiple different datasets -- including one that is a real-world dataset with heavy data imbalance -- significantly outperforming the state of the art.
1 code implementation • 4 Mar 2020 • Yonghyun Jeong, Hyunjin Choi, Byoungjip Kim, Youngjune Gwon
We propose DefogGAN, a generative approach to the problem of inferring state information hidden in the fog of war for real-time strategy (RTS) games.
no code implementations • 5 Sep 2017 • Miriam Cha, Youngjune Gwon, H. T. Kung
We argue that clustering with word embeddings in the metric space should yield feature representations in a higher semantic space appropriate for text regression.
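One common way to realize this idea is a bag-of-clusters feature: assign each word embedding in a document to its nearest cluster centroid and feed the resulting histogram to a regressor. A minimal numpy sketch under that assumption, not the authors' pipeline (centroids would come from, e.g., k-means over the embedding vocabulary):

```python
import numpy as np

def bag_of_clusters(doc_embeddings, centroids):
    """Map a document (a bag of word embeddings) to a normalized histogram
    over embedding-space clusters -- a compact semantic feature vector."""
    # distance from every word embedding to every centroid
    dists = np.linalg.norm(
        doc_embeddings[:, None, :] - centroids[None, :, :], axis=2)
    assignments = dists.argmin(axis=1)
    hist = np.bincount(assignments, minlength=len(centroids)).astype(float)
    return hist / hist.sum()
```

The histogram is what a downstream text-regression model would consume in place of raw word vectors.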
no code implementations • 30 Aug 2017 • Miriam Cha, Youngjune Gwon, H. T. Kung
Recent approaches in generative adversarial networks (GANs) can automatically synthesize realistic images from descriptive text.
no code implementations • 17 May 2016 • Youngjune Gwon, William Campbell, Kevin Brady, Douglas Sturim, Miriam Cha, H. T. Kung
Unsupervised feature learning methods have proven effective for classification tasks based on a single modality.
no code implementations • 19 Nov 2015 • Miriam Cha, Youngjune Gwon, H. T. Kung
In this paper, we present a multimodal framework for learning sparse representations that can capture semantic correlation between modalities.