no code implementations • ICML 2020 • Jaesik Yoon, Gautam Singh, Sungjin Ahn
Meta-transfer learning seeks to improve the efficiency of learning a new task via both meta-learning and transfer learning in a setting with a stream of evolving tasks.
no code implementations • 29 Feb 2024 • Hany Hamed, Subin Kim, Dongyeong Kim, Jaesik Yoon, Sungjin Ahn
The proposed agent realizes a divide-and-conquer-like strategy in dreaming.
no code implementations • 23 Feb 2024 • Junmo Cho, Jaesik Yoon, Sungjin Ahn
Adopting this approach, we demonstrate that memory utilization efficiency can be improved, leading to higher accuracy on various place-centric downstream tasks.
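To make the place-centric idea concrete, here is a toy sketch of an episodic memory indexed by where an observation was made rather than when it was made; the grid discretization and per-place capacity below are illustrative assumptions, not the paper's design.

```python
# An illustrative sketch of a place-centric memory: observations are stored
# under a discretized location key instead of a purely temporal index.
# Grid cell size and FIFO cap per place are assumptions for illustration.
from collections import defaultdict, deque

class PlaceMemory:
    def __init__(self, cell=1.0, per_place=8):
        self.cell = cell
        self.slots = defaultdict(lambda: deque(maxlen=per_place))

    def key(self, pos):                    # discretize (x, y) into a grid cell
        return (int(pos[0] // self.cell), int(pos[1] // self.cell))

    def write(self, pos, obs):
        self.slots[self.key(pos)].append(obs)

    def read(self, pos):                   # retrieve what was seen at this place
        return list(self.slots[self.key(pos)])

mem = PlaceMemory()
mem.write((0.2, 0.7), "red door")
mem.write((0.4, 0.9), "blue key")
mem.write((5.1, 2.3), "stairs")
print(mem.read((0.3, 0.8)))  # ['red door', 'blue key']
```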
no code implementations • 9 Feb 2023 • Jaesik Yoon, Yi-Fu Wu, Heechul Bae, Sungjin Ahn
In this paper, we empirically investigate the effectiveness of object-centric representation (OCR) pre-training for image-based reinforcement learning.
no code implementations • 19 Feb 2022 • Chang Chen, Yi-Fu Wu, Jaesik Yoon, Sungjin Ahn
We then share this world model with a transformer-based policy network and achieve stable training of a transformer-based RL agent (a minimal sketch of this shared backbone follows below).
Tasks: Model-based Reinforcement Learning, Reinforcement Learning, +1
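As a rough illustration of sharing a world-model backbone with a policy head, here is a minimal sketch; the module names, dimensions, and the single shared encoder are simplifying assumptions, not the authors' implementation.

```python
# A minimal sketch of a transformer world model whose backbone representation
# is shared with a policy head. All names and sizes here are illustrative.
import torch
import torch.nn as nn

class TransformerWorldModel(nn.Module):
    def __init__(self, latent_dim=32, n_actions=4, d_model=64):
        super().__init__()
        self.embed = nn.Linear(latent_dim + n_actions, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=2)
        self.next_latent = nn.Linear(d_model, latent_dim)  # dynamics head
        self.policy = nn.Linear(d_model, n_actions)        # shared policy head

    def forward(self, latents, actions):
        # latents: (B, T, latent_dim); actions: (B, T, n_actions), one-hot
        x = self.embed(torch.cat([latents, actions], dim=-1))
        T = x.size(1)
        # causal mask so each step attends only to the past
        mask = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
        h = self.backbone(x, mask=mask)
        return self.next_latent(h), self.policy(h)

model = TransformerWorldModel()
z = torch.randn(2, 5, 32)
a = torch.eye(4)[torch.randint(0, 4, (2, 5))]
next_z, action_logits = model(z, a)
print(next_z.shape, action_logits.shape)  # (2, 5, 32) and (2, 5, 4)
```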
no code implementations • 20 Jul 2021 • Yi-Fu Wu, Jaesik Yoon, Sungjin Ahn
We compare our model with previous RNN-based approaches as well as other possible video transformer baselines.
no code implementations • 29 Jun 2020 • Jaesik Yoon, Gautam Singh, Sungjin Ahn
When tasks change over time, meta-transfer learning seeks to improve the efficiency of learning a new task via both meta-learning and transfer learning.
no code implementations • 25 Sep 2019 • Jaesik Yoon, Gautam Singh, Sungjin Ahn
In this paper, we propose Attentive Sequential Neural Processes (ASNP), which resolve the underfitting in SNP by introducing a novel imaginary context as a latent variable and applying attention over that imaginary context.
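For intuition, here is a minimal sketch of Neural-Process-style attention over a context set, the mechanism ASNP additionally applies over its imaginary context; the single-head dot-product attention and the tensor shapes are assumptions for illustration.

```python
# A minimal sketch of cross-attention from target inputs to a context set,
# producing query-specific representations. Shapes are illustrative only.
import torch
import torch.nn.functional as F

def cross_attend(target_x, context_x, context_r):
    # target_x: (B, Nt, dx), context_x: (B, Nc, dx), context_r: (B, Nc, dr)
    scores = target_x @ context_x.transpose(1, 2) / target_x.size(-1) ** 0.5
    weights = F.softmax(scores, dim=-1)      # each target attends to context
    return weights @ context_r               # (B, Nt, dr) query-specific reps

r = cross_attend(torch.randn(2, 6, 3), torch.randn(2, 4, 3), torch.randn(2, 4, 8))
print(r.shape)  # torch.Size([2, 6, 8])
```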
1 code implementation • NeurIPS 2019 • Gautam Singh, Jaesik Yoon, Youngsung Son, Sungjin Ahn
In this paper, we propose Sequential Neural Processes (SNP), which incorporate a temporal state-transition model of stochastic processes and thus extend the modeling capabilities of Neural Processes to dynamic stochastic processes.
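The core idea, a state transition over the latent of a Neural Process so that the underlying stochastic process itself can evolve in time, can be sketched as follows; the GRU cell and Gaussian parameterization are illustrative assumptions, not the paper's exact architecture.

```python
# A minimal sketch of a temporal state-transition over an NP-style latent.
import torch
import torch.nn as nn

class LatentTransition(nn.Module):
    def __init__(self, z_dim=8, ctx_dim=8, h_dim=16):
        super().__init__()
        self.cell = nn.GRUCell(z_dim + ctx_dim, h_dim)
        self.to_mu = nn.Linear(h_dim, z_dim)
        self.to_logvar = nn.Linear(h_dim, z_dim)

    def step(self, z_prev, ctx, h):
        h = self.cell(torch.cat([z_prev, ctx], dim=-1), h)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterize
        return z, h

trans = LatentTransition()
z, h = torch.zeros(2, 8), torch.zeros(2, 16)
for t in range(3):                       # roll the latent forward in time
    z, h = trans.step(z, torch.randn(2, 8), h)
print(z.shape)  # torch.Size([2, 8])
```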
no code implementations • 26 Apr 2019 • Dongjun Lee, Jaesik Yoon, Jongyun Song, Sang-gil Lee, Sungroh Yoon
We show that our model outperforms state-of-the-art approaches on various text-to-SQL datasets in two respects: 1) SQL generation accuracy for the trained templates, and 2) adaptability to unseen SQL templates from a single example, without any additional training.
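A toy sketch of the template view of text-to-SQL that this entry relies on: generation splits into choosing an SQL template and filling its slots. The template string and the keyword-matching slot filler below are purely illustrative assumptions, not the paper's model.

```python
# A naive template-and-slot-filling sketch for text-to-SQL.
import re

TEMPLATE = "SELECT {col} FROM {table} WHERE {filter_col} = '{value}'"

def fill_template(question, schema):
    # naive slot filling: match schema column names mentioned in the question
    cols = [c for c in schema["columns"] if c in question.lower()]
    value = re.search(r"'([^']+)'", question)
    return TEMPLATE.format(col=cols[0], table=schema["table"],
                           filter_col=cols[1], value=value.group(1))

schema = {"table": "players", "columns": ["name", "team", "age"]}
print(fill_template("What is the name of the player on team 'Spurs'?", schema))
# SELECT name FROM players WHERE team = 'Spurs'
```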
2 code implementations • NeurIPS 2018 • Taesup Kim, Jaesik Yoon, Ousmane Dia, Sungwoong Kim, Yoshua Bengio, Sungjin Ahn
Learning to infer a Bayesian posterior from a few-shot dataset is an important step towards robust meta-learning due to the model uncertainty inherent in the problem.
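For context, Bayesian MAML-style methods approximate such a posterior with a set of particles updated by Stein Variational Gradient Descent; below is a minimal SVGD sketch on an assumed 1-D Gaussian target, not the paper's meta-learning loop.

```python
# A minimal sketch of posterior approximation with SVGD particles.
# The 1-D Gaussian target N(2, 1) is an illustrative assumption.
import torch

def rbf_kernel(x, h=1.0):
    diff = x[:, None] - x[None, :]        # diff[i, j] = x_i - x_j
    k = torch.exp(-diff ** 2 / (2 * h ** 2))
    grad_k = diff / h ** 2 * k            # d k(x_j, x_i) / d x_j
    return k, grad_k

def svgd_step(particles, log_prob_grad, lr=0.1):
    k, grad_k = rbf_kernel(particles)
    # attractive term (kernel-weighted score) + repulsive term (kernel gradient)
    phi = (k @ log_prob_grad + grad_k.sum(dim=1)) / particles.numel()
    return particles + lr * phi

x = torch.randn(50)
for _ in range(500):
    x = svgd_step(x, -(x - 2.0))          # score of N(2, 1) is -(x - 2)
print(x.mean().item(), x.std().item())    # approx 2.0 and 1.0
```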
no code implementations • 16 Apr 2018 • Jaekoo Lee, Byunghan Lee, Jongyoon Song, Jaesik Yoon, Yongsik Lee, Dong-hun Lee, Sungroh Yoon
Experimental results on real-world data confirm the effectiveness of the proposed system and models.