no code implementations • 20 Mar 2024 • Joo Yong Shim, Jean Seong Bjorn Choe, Jong-Kook Kim
This article proposes auction-inspired multi-player generative adversarial network (GAN) training, which mitigates the mode collapse problem of GANs.
no code implementations • 19 Mar 2024 • Eunjee Choi, Jong-Kook Kim
This paper introduces an end-to-end model called TT-BLIP that applies bootstrapping language-image pretraining for unified vision-language understanding and generation (BLIP) to three types of information: BERT and BLIP_Txt for text, ResNet and BLIP_Img for images, and bidirectional BLIP encoders for multimodal information.
no code implementations • 28 Apr 2023 • Sungwoo Kang, Jong-Kook Kim
The backtest uses data from 2020 to 2022.
no code implementations • 20 Jul 2022 • HaeChun Chung, JooYong Shim, Jong-Kook Kim
Multiple modalities provide a variety of perspectives on the same information, which can improve understanding of that information.
no code implementations • 24 Oct 2021 • XinYu Piao, DoangJoo Synn, JooYoung Park, Jong-Kook Kim
This method helps deep learning models train by splitting a batch into chunks small enough to fit in the remaining memory and streaming them sequentially.
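The split-and-stream idea described above can be sketched as follows. This is a minimal toy illustration, not the authors' implementation: the function names (`stream_batch`, `train_step`) and the scalar `grad_fn` are hypothetical, and the accumulation shows only why chunk-wise processing can reproduce the full-batch average gradient.

```python
def stream_batch(batch, max_chunk):
    """Yield chunks of at most max_chunk samples, in order."""
    for start in range(0, len(batch), max_chunk):
        yield batch[start:start + max_chunk]

def train_step(batch, max_chunk, grad_fn):
    """Hypothetical step: accumulate per-chunk gradients so the
    result equals the full-batch average gradient."""
    total, n = 0.0, 0
    for chunk in stream_batch(batch, max_chunk):
        g = grad_fn(chunk)          # gradient averaged over this chunk
        total += g * len(chunk)     # re-weight by chunk size
        n += len(chunk)
    return total / n
```

With `grad_fn` as a simple mean, `train_step(list(range(10)), 4, ...)` returns the same value as applying `grad_fn` to the whole batch at once, which is the point of streaming chunks instead of loading the full batch.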
no code implementations • 16 Aug 2021 • Yunseok Kwak, Won Joon Yun, Soyi Jung, Jong-Kook Kim, Joongheon Kim
The emergence of quantum computing enables researchers to apply quantum circuits to many existing studies.
1 code implementation • 17 Feb 2021 • Dongjae Kim, Jong-Kook Kim
The skip-gram (SG) model learns word representation by predicting the words surrounding a center word from unstructured text data.
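The skip-gram objective described above starts from (center, context) training pairs extracted with a sliding window. A minimal sketch of that pair extraction (the function name `skipgram_pairs` is hypothetical and this omits the model's embedding and softmax layers):

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) pairs for skip-gram training:
    each word predicts the words within `window` positions of it."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:                       # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs
```

For example, `skipgram_pairs(["the", "quick", "brown", "fox"], window=1)` pairs each word with its immediate neighbors; the model then learns embeddings by maximizing the probability of each context word given its center word.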
no code implementations • 9 Jan 2020 • Joohyung Jeon, Junhui Kim, Joongheon Kim, Kwangsoo Kim, Aziz Mohaisen, Jong-Kook Kim
This paper proposes a distributed deep learning framework for privacy-preserving medical data training.