Network Embedding
153 papers with code • 0 benchmarks • 4 datasets
Network Embedding, also known as "Network Representation Learning", is a collective term for techniques that map graph nodes to vectors of real numbers in a multidimensional space. To be useful, a good embedding should preserve the structure of the graph. The vectors can then be used as input to various network and graph analysis tasks, such as link prediction.
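As a minimal, hand-rolled sketch of the idea (not any specific published method): one simple way to obtain structure-preserving vectors is a spectral factorization of the adjacency matrix, after which candidate links can be scored by inner product of node vectors. The graph below is a made-up toy example; `embed` and `link_score` are illustrative names, not a library API.

```python
import numpy as np

def embed(adj, dim):
    """Map each node to a `dim`-dimensional vector using the top eigenpairs
    of the adjacency matrix (a simple spectral embedding)."""
    w, v = np.linalg.eigh(adj)  # eigenvalues in ascending order
    # Scale eigenvectors by sqrt of (non-negative) eigenvalues.
    return v[:, -dim:] * np.sqrt(np.clip(w[-dim:], 0.0, None))

# Toy graph: two triangles (0-1-2 and 3-4-5) joined by the bridge edge (2, 3).
n = 6
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
adj = np.zeros((n, n))
for i, j in edges:
    adj[i, j] = adj[j, i] = 1.0

emb = embed(adj, dim=2)

def link_score(a, b):
    """Higher score = the embedding considers a link (a, b) more plausible."""
    return float(emb[a] @ emb[b])

# Because the embedding preserves graph structure, a within-triangle pair
# scores higher than a pair separated by the bridge.
print(link_score(0, 1) > link_score(0, 4))  # True
```

Real methods such as DeepWalk or node2vec replace the direct factorization with random-walk sampling so they scale to large graphs, but the downstream use of the vectors (link prediction, node classification) is the same.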
Benchmarks
These leaderboards are used to track progress in Network Embedding
Libraries
Use these libraries to find Network Embedding models and implementations
Latest papers
Representation Learning on Heterostructures via Heterogeneous Anonymous Walks
Capturing structural similarity has recently been a hot topic in the field of network embedding, because it helps in understanding node functions and behaviors.
TME-BNA: Temporal Motif-Preserving Network Embedding with Bicomponent Neighbor Aggregation
Evolving temporal networks serve as abstractions of many real-life dynamic systems, e.g., social networks and e-commerce.
Vaccine skepticism detection by network embedding
It is even more difficult to understand why vax-skeptic opinions are becoming more popular.
Multi-Relation Aware Temporal Interaction Network Embedding
However, existing temporal interaction network embedding methods only use historical interaction relations to mine neighbor nodes, ignoring other relation types.
Signed Bipartite Graph Neural Networks
Signed bipartite networks differ from classical signed networks in that they contain two distinct node sets, with signed links between the two sets.
Temporal Graph Network Embedding with Causal Anonymous Walks Representations
For evaluation, we provide a benchmark pipeline for the evaluation of temporal network embeddings.
SiReN: Sign-Aware Recommendation Using Graph Neural Networks
In recent years, many recommender systems using network embedding (NE), such as graph neural networks (GNNs), have been extensively studied with the aim of improving recommendation accuracy.
TextCNN with Attention for Text Classification
By using WordRank for vocabulary selection we can reduce the number of parameters by more than 5x, from 7.9M to 1.5M, while accuracy decreases by only 1.2%.
A Survey on Role-Oriented Network Embedding
A wide variety of NE methods focus on the proximity of networks.
Discrete-time Temporal Network Embedding via Implicit Hierarchical Learning in Hyperbolic Space
To explore these properties of a complex temporal network, we propose a hyperbolic temporal graph network (HTGN) that fully takes advantage of the exponential capacity and hierarchical awareness of hyperbolic geometry.