Node Clustering
62 papers with code • 19 benchmarks • 14 datasets
Most implemented papers
Exploiting Node Content for Multiview Graph Convolutional Network and Adversarial Regularization
Network representation learning (NRL) is crucial in the area of graph learning.
Identity-aware Graph Neural Networks
However, the expressive power of existing GNNs is upper-bounded by the 1-Weisfeiler-Lehman (1-WL) graph isomorphism test, which means GNNs are not able to predict node clustering coefficients and shortest path distances, and cannot differentiate between different d-regular graphs.
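The 1-WL limit above can be made concrete with a small sketch (function and variable names here are illustrative, not from the paper): a hexagon and two disjoint triangles are both 2-regular graphs on six nodes, so 1-WL color refinement assigns them identical color multisets, even though their clustering coefficients differ (0.0 vs. 1.0).

```python
# Minimal 1-WL color refinement sketch (illustrative names and setup).
def wl_colors(adj, rounds=3):
    """Run 1-WL refinement on an adjacency dict; return the final color multiset."""
    colors = {v: 0 for v in adj}
    for _ in range(rounds):
        # New color = own color combined with the sorted multiset of neighbor colors.
        colors = {v: hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
                  for v in adj}
    return sorted(colors.values())

hexagon   = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
             3: [4, 5], 4: [3, 5], 5: [3, 4]}

# Both graphs are 2-regular, so every node keeps the same color in every round.
print(wl_colors(hexagon) == wl_colors(triangles))  # True: 1-WL cannot tell them apart
```

Since standard message-passing GNNs are at most as expressive as 1-WL, they produce identical representations for these two graphs despite their different clustering structure.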
LIME: Low-Cost and Incremental Learning for Dynamic Heterogeneous Information Networks
To effectively trade information sharing for reduced memory footprint, we employ the recursive neural network (RsNN) with carefully designed optimization strategies to explore the node semantics in a novel cuboid space.
Accurate Learning of Graph Representations with Graph Multiset Pooling
Graph neural networks have been widely used for modeling graph data, achieving impressive results on node classification and link prediction tasks.
Deepened Graph Auto-Encoders Help Stabilize and Enhance Link Prediction
Graph neural networks have been used for a variety of learning tasks, such as link prediction, node classification, and node clustering.
Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks
Furthermore, they cannot fully capture the content-based correlations between nodes, as they either do not use the self-attention mechanism or only use it to consider the immediate neighbors of each node, ignoring the higher-order neighbors.
Permutation-Invariant Variational Autoencoder for Graph-Level Representation Learning
In this work we address this issue by proposing a permutation-invariant variational autoencoder for graph structured data.
Unsupervised Deep Manifold Attributed Graph Embedding
Unsupervised attributed graph representation learning is challenging since both structural and feature information are required to be represented in the latent space.
Node Embedding using Mutual Information and Self-Supervision based Bi-level Aggregation
Graph Neural Networks (GNNs) learn low dimensional representations of nodes by aggregating information from their neighborhood in graphs.
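The neighborhood-aggregation step described above can be sketched in a few lines; this is a generic mean-aggregation layer under my own toy setup (names, weights, and the ReLU choice are assumptions, not the paper's architecture):

```python
import numpy as np

def gnn_layer(adj, x, w):
    """One message-passing layer: average each node's features with its
    neighbors' (via self-loops), then apply a linear transform and ReLU."""
    n = adj.shape[0]
    a_hat = adj + np.eye(n)                  # add self-loops
    deg = a_hat.sum(axis=1, keepdims=True)   # degree of each node (incl. self)
    h = (a_hat @ x) / deg                    # mean aggregation over neighborhood
    return np.maximum(h @ w, 0.0)            # linear transform + ReLU

adj = np.array([[0, 1, 1],
                [1, 0, 0],
                [1, 0, 0]], dtype=float)     # toy 3-node graph
x = np.eye(3)                                # one-hot node features
w = np.ones((3, 2))                          # toy weight matrix (dim 3 -> 2)
print(gnn_layer(adj, x, w).shape)            # (3, 2): one 2-d embedding per node
```

Stacking several such layers lets each node's representation depend on increasingly distant neighborhoods.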
Free Energy Node Embedding via Generalized Skip-gram with Negative Sampling
On the other hand, we propose a matrix factorization method based on a loss function that generalizes that of the skip-gram model with negative sampling to arbitrary similarity matrices.
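One common way to read "factorizing a similarity matrix" is a truncated eigendecomposition/SVD; the sketch below is a generic rank-k factorization under toy choices (the two-step similarity matrix and the embedding dimension are my assumptions, not the paper's construction):

```python
import numpy as np

def embed(similarity, dim):
    """Rank-`dim` factorization S ~ U diag(s) V^T; rows of U * sqrt(s)
    serve as node embeddings."""
    u, s, _ = np.linalg.svd(similarity)
    return u[:, :dim] * np.sqrt(s[:dim])

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)  # toy 4-node graph
sim = adj @ adj                              # toy two-step similarity matrix
emb = embed(sim, dim=2)
print(emb.shape)                             # (4, 2): one 2-d embedding per node
```

Swapping in different similarity matrices (e.g., powers of the adjacency or transition matrix) changes which structural relations the embeddings preserve, which is the flexibility the abstract refers to.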