Learning Word Embeddings
23 papers with code • 0 benchmarks • 0 datasets
Latest papers
Multi-Relational Hyperbolic Word Embeddings from Natural Language Definitions
Natural language definitions possess a recursive, self-explanatory semantic structure that can support representation learning methods capable of preserving explicit conceptual relations and constraints in the latent space.
One Embedder, Any Task: Instruction-Finetuned Text Embeddings
Our analysis suggests that INSTRUCTOR is robust to changes in instructions, and that instruction finetuning mitigates the challenge of training a single model on diverse datasets.
MorphTE: Injecting Morphology in Tensorized Embeddings
In the era of deep learning, word embeddings are essential for text-processing tasks.
ViCE: Improving Dense Representation Learning by Superpixelization and Contrasting Cluster Assignment
Recent self-supervised models have demonstrated equal or better performance than supervised methods, opening the way for AI systems to learn visual representations from practically unlimited data.
InfiniteWalk: Deep Network Embeddings as Laplacian Embeddings with a Nonlinearity
We study the objective in the limit as T goes to infinity, which allows us to simplify the expression of Qiu et al. We prove that this limiting objective corresponds to factoring a simple transformation of the pseudoinverse of the graph Laplacian, linking DeepWalk to extensive prior work in spectral graph embeddings.
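The spectral connection the abstract describes can be illustrated with a minimal sketch: compute the pseudoinverse of an unnormalized graph Laplacian and factor it into a low-dimensional embedding via its eigendecomposition. The toy graph and dimension are illustrative choices, not the paper's exact transformation or nonlinearity.

```python
import numpy as np

# Toy undirected graph on 4 nodes, given as an adjacency matrix.
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))   # degree matrix
L = D - A                    # unnormalized graph Laplacian
L_pinv = np.linalg.pinv(L)   # Moore-Penrose pseudoinverse of L

# Spectral embedding: keep the d leading eigencomponents of L_pinv,
# scaled so that embedding @ embedding.T approximates L_pinv.
d = 2
eigvals, eigvecs = np.linalg.eigh(L_pinv)
order = np.argsort(eigvals)[::-1]  # largest eigenvalues first
embedding = eigvecs[:, order[:d]] * np.sqrt(eigvals[order[:d]])
```

Each row of `embedding` is a node vector; nodes that are well connected in the graph end up close together in this space.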
Machine Translation with Cross-lingual Word Embeddings
Learning word embeddings from distributional information is a widely studied task with an extensive literature.
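The distributional approach mentioned above can be sketched in a few lines: count word co-occurrences within a context window, then factor the count matrix with a truncated SVD to obtain dense vectors. The tiny corpus, window size, and dimension here are illustrative assumptions, not any specific paper's setup.

```python
import numpy as np

# Tiny illustrative corpus.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]
vocab = sorted({w for s in corpus for w in s.split()})
index = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a window of 2 words.
window = 2
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    words = sent.split()
    for i, w in enumerate(words):
        lo, hi = max(0, i - window), min(len(words), i + window + 1)
        for j in range(lo, hi):
            if i != j:
                counts[index[w], index[words[j]]] += 1

# Dense embeddings: truncated SVD of the log-smoothed count matrix.
U, S, Vt = np.linalg.svd(np.log1p(counts))
dim = 3
embeddings = U[:, :dim] * S[:dim]
```

Words that appear in similar contexts (e.g. "cat" and "dog" here) receive similar rows in `embeddings`.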
Neural Graph Embedding Methods for Natural Language Processing
Knowledge graphs are structured representations of facts in a graph, where nodes represent entities and edges represent relationships between them.
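One widely used family of graph embedding methods over such triples is TransE, which treats a relation as a translation in vector space: a fact (h, r, t) is plausible when h + r lands near t. A minimal sketch, with random placeholder vectors standing in for trained embeddings:

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 8

# Placeholder entity and relation vectors for a toy knowledge graph;
# in practice these would be learned from observed triples.
entities = {e: rng.standard_normal(dim)
            for e in ["Paris", "France", "Berlin", "Germany"]}
relations = {"capital_of": rng.standard_normal(dim)}

def transe_score(head, rel, tail):
    """TransE-style plausibility: higher (closer to 0) when the
    translated head h + r lies near the tail t."""
    return -np.linalg.norm(entities[head] + relations[rel] - entities[tail])

score = transe_score("Paris", "capital_of", "France")
```

Training would adjust the vectors so that observed triples score higher than corrupted ones; the scoring function itself is all that is shown here.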
Towards Incremental Learning of Word Embeddings Using Context Informativeness
In this paper, we investigate the task of learning word embeddings from very sparse data in an incremental, cognitively-plausible way.
Few-Shot Representation Learning for Out-Of-Vocabulary Words
Existing approaches for learning word embeddings often assume there are sufficient occurrences for each word in the corpus, such that the representation of words can be accurately estimated from their contexts.
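A common baseline for the out-of-vocabulary setting the abstract describes is to estimate a rare word's vector from the embeddings of its few observed context words. The sketch below simply averages context vectors; the vocabulary and vectors are random placeholders, and `embed_oov` is a hypothetical helper, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder "pre-trained" embeddings for in-vocabulary words.
known = {w: rng.standard_normal(4) for w in ["the", "cat", "sat", "on", "mat"]}

def embed_oov(contexts, known_embeddings):
    """Estimate an out-of-vocabulary word's vector as the mean of the
    embeddings of the words surrounding its few observed occurrences."""
    vecs = [known_embeddings[w]
            for ctx in contexts for w in ctx if w in known_embeddings]
    return np.mean(vecs, axis=0)

# Two observed contexts of an unseen word.
contexts = [["the", "cat", "sat"], ["on", "the", "mat"]]
v = embed_oov(contexts, known)
```

Few-shot methods improve on this baseline by learning how to combine context information rather than averaging uniformly.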
Variational Sequential Labelers for Semi-Supervised Learning
Our model family consists of a latent-variable generative model and a discriminative labeler.