no code implementations • 1 May 2024 • Chaejeong Lee, Jeongwhan Choi, Hyowon Wi, Sung-Bae Cho, Noseong Park
In this paper, we propose SCONE, a novel Stochastic sampling scheme for i) COntrastive views and ii) hard NEgative samples, to overcome these issues.
no code implementations • 6 Jan 2024 • Jeongwhan Choi, Duksan Ryu
We propose a novel approach called QoS-aware graph contrastive learning (QAGCL) for web service recommendation.
no code implementations • 27 Dec 2023 • Jeongwhan Choi, Hyowon Wi, Chaejeong Lee, Sung-Bae Cho, Dongha Lee, Noseong Park
Contrastive learning (CL) has emerged as a promising technique for improving recommender systems, addressing the challenge of data sparsity by leveraging self-supervised signals from raw data.
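The self-supervised signal in contrastive learning is typically an InfoNCE-style objective that pulls two views of the same entity together and pushes negatives apart. A minimal numpy sketch of that general idea (not this paper's specific method; all names and values here are illustrative):

```python
import numpy as np

def info_nce(anchor, positive, negatives, tau=0.2):
    """InfoNCE loss for one anchor: maximize similarity to the positive
    view relative to the negative samples. Inputs are L2-normalized."""
    pos = np.exp(anchor @ positive / tau)
    neg = np.exp(anchor @ negatives.T / tau).sum()
    return -np.log(pos / (pos + neg))

rng = np.random.default_rng(0)
def unit(v):
    return v / np.linalg.norm(v)

a = unit(rng.normal(size=8))                      # anchor embedding
p = unit(a + 0.05 * rng.normal(size=8))           # slightly perturbed view
n = np.stack([unit(rng.normal(size=8)) for _ in range(16)])  # negatives

print(info_nce(a, p, n))   # small when the positive is the closest sample
```

The loss is lowest when the positive view is much more similar to the anchor than any negative, which is what drives representation learning from raw interaction data.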
1 code implementation • 19 Dec 2023 • Youn-Yeol Yu, Jeongwhan Choi, Woojin Cho, Kookjin Lee, Nayong Kim, Kiseok Chang, Chang-Seung Woo, Ilho Kim, Seok-Woo Lee, Joon-Young Yang, Sooyoung Yoon, Noseong Park
These methods are typically designed to i) reduce the computational cost in solving physical dynamics and/or ii) propose techniques to enhance the solution accuracy in fluid and rigid body dynamics.
2 code implementations • 16 Dec 2023 • Yehjin Shin, Jeongwhan Choi, Hyowon Wi, Noseong Park
In the sequential recommendation (SR) domain, we show for the first time that the same problem occurs.
Ranked #1 on Sequential Recommendation on MovieLens 1M
no code implementations • 12 Dec 2023 • Jayoung Kim, Yehjin Shin, Jeongwhan Choi, Hyowon Wi, Noseong Park
Structured data, which constitutes a significant portion of existing data types, has been a long-standing research topic in the field of machine learning.
no code implementations • 7 Dec 2023 • Jeongwhan Choi, Hyowon Wi, Jayoung Kim, Yehjin Shin, Kookjin Lee, Nathaniel Trask, Noseong Park
Transformers, renowned for their self-attention mechanism, have achieved state-of-the-art performance across various tasks in natural language processing, computer vision, time-series modeling, etc.
no code implementations • 8 Nov 2023 • Seonkyu Lim, Jaehyeon Park, Seojin Kim, Hyowon Wi, Haksoo Lim, Jinsung Jeon, Jeongwhan Choi, Noseong Park
Long-term time series forecasting (LTSF) is a challenging task that has been investigated in various domains such as finance investment, health care, traffic, and weather forecasting.
2 code implementations • 20 Mar 2023 • Jeongwhan Choi, Noseong Park
A prevalent approach in the field is to combine graph convolutional networks and recurrent neural networks for the spatio-temporal processing.
Ranked #2 on Traffic Prediction on PeMSD7(L)
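The GCN-plus-RNN pattern mentioned above alternates a spatial step (graph convolution over the road network) with a temporal step (a recurrent update per node). A toy numpy sketch of that generic combination, with made-up shapes and a simplified tanh recurrence in place of a full GRU cell:

```python
import numpy as np

rng = np.random.default_rng(1)
N, F, H, T = 4, 3, 5, 6            # nodes, features, hidden size, time steps

A = np.ones((N, N))                # toy adjacency (fully connected + self-loops)
D = A.sum(1)
A_hat = A / np.sqrt(np.outer(D, D))   # symmetrically normalized adjacency

Wx = rng.normal(size=(F, H))
Wh = rng.normal(size=(H, H))
h = np.zeros((N, H))               # hidden state per node
X = rng.normal(size=(T, N, F))     # synthetic sensor readings over time

for t in range(T):
    gc = A_hat @ X[t] @ Wx         # spatial step: one graph convolution
    h = np.tanh(gc + h @ Wh)       # temporal step: simple recurrent update

print(h.shape)
```

Real traffic models replace the tanh update with gated recurrent cells and stack several such layers, but the spatial-then-temporal structure is the same.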
1 code implementation • 25 Nov 2022 • Jeongwhan Choi, Seoyoung Hong, Noseong Park, Sung-Bae Cho
In particular, diffusion equations have been widely used for designing the core processing layer of GNNs, and therefore they are inevitably vulnerable to the notorious oversmoothing problem.
no code implementations • 22 Nov 2022 • Jaehoon Lee, Chan Kim, Gyumin Lee, Haksoo Lim, Jeongwhan Choi, Kookjin Lee, Dongeun Lee, Sanghyun Hong, Noseong Park
Forecasting future outcomes from recent time series data is not easy, especially when the future data are different from the past (i. e. time series are under temporal drifts).
1 code implementation • 17 Nov 2022 • Jeongwhan Choi, Seoyoung Hong, Noseong Park, Sung-Bae Cho
Various methods have been proposed for collaborative filtering, ranging from matrix factorization to graph convolutional methods.
Ranked #1 on Collaborative Filtering on Gowalla
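Matrix factorization, the classical end of the spectrum mentioned above, fits user and item embeddings so that their dot products reproduce the observed ratings. A minimal numpy sketch with an illustrative toy rating matrix (not the paper's model or data):

```python
import numpy as np

rng = np.random.default_rng(0)
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)   # 0 marks unobserved entries
mask = R > 0
k, lr, reg = 2, 0.01, 0.02                  # latent dim, step size, L2 weight
U = 0.1 * rng.normal(size=(R.shape[0], k))  # user embeddings
V = 0.1 * rng.normal(size=(R.shape[1], k))  # item embeddings

for _ in range(2000):
    E = mask * (R - U @ V.T)                # error on observed entries only
    U += lr * (E @ V - reg * U)             # gradient step for users
    V += lr * (E.T @ U - reg * V)           # gradient step for items

rmse = np.sqrt((E ** 2).sum() / mask.sum())
print(rmse)
```

Graph convolutional recommenders generalize this by propagating the embeddings over the user-item interaction graph before taking the dot product.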
2 code implementations • 30 Aug 2022 • Seoyoung Hong, Heejoo Shin, Jeongwhan Choi, Noseong Park
In addition, owing to the continuous and bijective characteristics of NODEs, we design a one-shot price optimization method given a pre-trained prediction model, which requires only one iteration to find the optimal solution.
1 code implementation • 7 Dec 2021 • Jeongwhan Choi, Hwangyong Choi, Jeehyun Hwang, Noseong Park
A prevalent approach in the field is to combine graph convolutional networks and recurrent neural networks for the spatio-temporal processing.
Ranked #3 on Traffic Prediction on PeMSD7(L)
2 code implementations • 14 Nov 2021 • Taeyong Kong, Taeri Kim, Jinsung Jeon, Jeongwhan Choi, Yeon-Chang Lee, Noseong Park, Sang-Wook Kim
To our knowledge, we are the first to design such a hybrid method and to report the correlation between graph centrality and the linearity/non-linearity of nodes.
2 code implementations • 11 Nov 2021 • Jeehyun Hwang, Jeongwhan Choi, Hwangyong Choi, Kookjin Lee, Dongeun Lee, Noseong Park
On the other hand, neural ordinary differential equations (NODEs) learn a latent governing ODE from data.
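A NODE parameterizes the right-hand side of dx/dt = f_theta(x) and fits theta so that integrated trajectories match observations. A deliberately tiny linear sketch of that idea (an illustrative 1-D example, not the paper's model): recover the decay rate of dx/dt = a*x from a sampled trajectory by fitting one-step Euler predictions.

```python
import numpy as np

true_a, dt = -0.5, 0.1
t = np.arange(0, 5, dt)
x = np.exp(true_a * t)                  # observed trajectory, x(0) = 1

a = 0.0                                 # the single learnable parameter
lr = 100.0                              # large step is safe: the loss is a 1-D quadratic
for _ in range(500):
    pred = x[:-1] + dt * a * x[:-1]     # one Euler step from each sample
    grad = 2 * ((pred - x[1:]) * dt * x[:-1]).mean()
    a -= lr * grad

print(a)   # ~ -0.49; the small bias comes from the Euler discretization
```

Real NODEs replace the scalar a with a neural network and the hand-derived gradient with backpropagation through an ODE solver, but the learning signal is the same: make integrated states match the data.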
2 code implementations • 8 Aug 2021 • Jeongwhan Choi, Jinsung Jeon, Noseong Park
In this work, we extend them based on neural ordinary differential equations (NODEs), because the linear GCN concept can be interpreted as a differential equation, and present Learnable-Time ODE-based Collaborative Filtering (LT-OCF).
Ranked #1 on Recommendation Systems on Amazon-book
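The "linear GCN as a differential equation" view can be checked in a few lines: LightGCN-style propagation x_{k+1} = A_hat x_k is exactly the unit-step Euler discretization of the linear ODE dx/dt = (A_hat - I) x. A numpy sketch on a random toy graph (the graph and sizes here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 6
A = (rng.random((N, N)) < 0.5).astype(float)
A = np.triu(A, 1)
A = A + A.T + np.eye(N)                          # symmetric adjacency + self-loops
D = A.sum(1)
A_hat = A / np.sqrt(np.outer(D, D))              # normalized adjacency

x0 = rng.normal(size=N)
layers = np.linalg.matrix_power(A_hat, 3) @ x0   # three stacked linear GCN layers

x = x0.copy()
for _ in range(3):
    x = x + 1.0 * (A_hat - np.eye(N)) @ x        # Euler step of dx/dt = (A_hat - I)x, dt = 1

print(np.allclose(x, layers))                    # True: the two views coincide
```

Once layer stacking is seen as time integration, the integration time itself can be made continuous and learnable, which is the direction LT-OCF pursues.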