1 code implementation • 19 Sep 2023 • Taehyung Kwon, Jihoon Ko, Jinhong Jung, Kijung Shin
While many tensor compression algorithms are available, most of them rely on strong assumptions about the data, such as its order, sparsity, rank, and smoothness.
1 code implementation • 6 Jul 2023 • Geonwoo Ko, Jinhong Jung
In this paper, we propose DINES, a novel method for learning disentangled node representations in signed directed graphs without social assumptions.
1 code implementation • 9 Feb 2023 • Taehyung Kwon, Jihoon Ko, Jinhong Jung, Kijung Shin
The updates take time linear in the number of non-zeros in the input matrix, and the approximation of each entry can be retrieved in logarithmic time.
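To illustrate how per-entry retrieval can be logarithmic rather than linear in the matrix size, here is a minimal sketch under an assumed Kronecker-power compression model (an illustrative stand-in, not the paper's actual method): an entry (i, j) of the compressed matrix is reconstructed by walking over the log2(n) bits of its indices and multiplying one small factor per level, so no full row or column is ever materialized.

```python
import numpy as np

def kron_entry(factor, i, j, depth):
    """Approximate entry (i, j) of the depth-fold Kronecker power of a 2x2
    factor matrix, touching only depth = log2(n) factors per lookup."""
    value = 1.0
    for level in range(depth):
        # Bits of i and j (most significant first) pick one cell per level.
        bi = (i >> (depth - 1 - level)) & 1
        bj = (j >> (depth - 1 - level)) & 1
        value *= factor[bi, bj]
    return value

# Tiny check against the explicit Kronecker power for n = 8 (depth = 3).
factor = np.array([[0.9, 0.5], [0.4, 0.1]])
dense = factor
for _ in range(2):
    dense = np.kron(dense, factor)
assert np.isclose(kron_entry(factor, 5, 2, 3), dense[5, 2])
```

The same idea scales to tensors by descending one level per index digit along each mode; the actual model in the paper replaces the fixed factor with learned parameters.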
1 code implementation • 2 Nov 2022 • Jong-whi Lee, Jinhong Jung
How can we augment a dynamic graph for improving the performance of dynamic graph neural networks?
1 code implementation • 9 Jun 2022 • Jaemin Yoo, Hyunsik Jeon, Jinhong Jung, U Kang
Given a graph with partial observations of node features, how can we estimate the missing features accurately?
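As a point of reference for this problem setup, a generic feature-propagation baseline (not necessarily the paper's proposed method) fills in missing features by repeatedly averaging neighbor features while clamping observed nodes to their known values; the sketch below uses a toy dense adjacency matrix for clarity.

```python
import numpy as np

def propagate_features(adj, feats, observed_mask, num_iters=20):
    """Feature-propagation baseline: repeatedly average neighbor features,
    resetting observed nodes to their known values after each step."""
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    est = np.where(observed_mask[:, None], feats, 0.0)
    known = est.copy()
    for _ in range(num_iters):
        est = adj @ est / deg                       # mean over neighbors
        est[observed_mask] = known[observed_mask]   # keep observed features fixed
    return est

# Toy 4-node path graph; features of nodes 1 and 2 are missing.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
feats = np.array([[1.0], [0.0], [0.0], [3.0]])
observed = np.array([True, False, False, True])
print(propagate_features(adj, feats, observed).round(2))
```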
no code implementations • 28 Dec 2020 • Jinhong Jung, Jaemin Yoo, U Kang
In this paper, we propose Signed Graph Diffusion Network (SGDNet), a novel graph neural network that achieves end-to-end node representation learning for link sign prediction in signed social graphs.
no code implementations • 19 Dec 2020 • JaeHun Jung, Jinhong Jung, U Kang
However, most of the existing models for TKG completion extend static KG embeddings that do not fully exploit TKG structure, thus lacking in 1) accounting for temporally relevant events already residing in the local neighborhood of a query, and 2) path-based inference that facilitates multi-hop reasoning and better interpretability.
no code implementations • 9 Nov 2020 • Jinhong Jung, Lee Sael
How can we compute the pseudoinverse of a sparse feature matrix efficiently and accurately for solving optimization problems?
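For context, the standard way to apply a pseudoinverse to a sparse matrix without densifying it is to solve the corresponding least-squares problem with an iterative sparse solver; the sketch below (a generic baseline with hypothetical data, not the paper's reordering-based approach) uses SciPy's LSQR.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import lsqr

# Hypothetical sparse feature matrix X (n samples x d features) and targets y.
rng = np.random.default_rng(0)
X = sp.random(1000, 50, density=0.01, format="csr", random_state=0)
y = rng.standard_normal(1000)

# Baseline: solve min ||Xw - y||_2, i.e. w = X^+ y applied implicitly,
# without ever forming the dense pseudoinverse of X.
w = lsqr(X, y)[0]
print(w.shape)  # (50,)
```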