Learning Word Embeddings

11 papers with code · Methodology

Benchmarks

No evaluation results yet. Help compare methods by submitting evaluation metrics.

Latest papers without code

In Neural Machine Translation, What Does Transfer Learning Transfer?

ACL 2020

Transfer learning improves quality for low-resource machine translation, but it is unclear what exactly it transfers.

LEARNING WORD EMBEDDINGS MACHINE TRANSLATION TRANSFER LEARNING

Apprentissage de plongements de mots sur des corpus en langue de spécialité : une étude d'impact (Learning word embeddings on domain-specific corpora: an impact study)

JEPTALNRECITAL 2020

To answer this question, we consider two domain-specific corpora: OHSUMED, from the medical domain, and a corpus of technical documentation owned by SNCF.

LEARNING WORD EMBEDDINGS

InfiniteWalk: Deep Network Embeddings as Laplacian Embeddings with a Nonlinearity

29 May 2020

We study the objective in the limit as T goes to infinity, which allows us to simplify the expression of Qiu et al. We prove that this limiting objective corresponds to factoring a simple transformation of the pseudoinverse of the graph Laplacian, linking DeepWalk to extensive prior work in spectral graph embeddings.

LEARNING WORD EMBEDDINGS MULTI-LABEL CLASSIFICATION
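The abstract above links DeepWalk's limiting objective to factoring a transformation of the pseudoinverse of the graph Laplacian. As a rough illustration only (a toy 4-node graph of my own choosing, not the paper's exact transformation), here is what "factor the Laplacian pseudoinverse into a spectral embedding" looks like in NumPy:

```python
import numpy as np

# Toy undirected graph (a 4-cycle) as an adjacency matrix -- an assumed
# example, not a graph from the paper.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

D = np.diag(A.sum(axis=1))    # degree matrix
L = D - A                     # combinatorial graph Laplacian
L_pinv = np.linalg.pinv(L)    # Moore-Penrose pseudoinverse of L

# A rank-k spectral embedding: eigendecompose the (symmetric) pseudoinverse
# and keep the top-k eigenpairs, scaled by the square roots of eigenvalues.
k = 2
vals, vecs = np.linalg.eigh(L_pinv)
order = np.argsort(vals)[::-1]  # largest eigenvalues first
embedding = vecs[:, order[:k]] * np.sqrt(np.maximum(vals[order[:k]], 0))

print(embedding.shape)  # (4, 2)
```

Each row of `embedding` is a node vector; the paper's contribution is showing that DeepWalk's T-to-infinity objective corresponds to factoring such a (transformed) pseudoinverse.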

Learning Cross-Context Entity Representations from Text

11 Jan 2020

Language modeling tasks, in which words, or word-pieces, are predicted on the basis of a local context, have been very effective for learning word embeddings and context dependent representations of phrases.

ENTITY LINKING LANGUAGE MODELLING LEARNING WORD EMBEDDINGS

Machine Translation with Cross-lingual Word Embeddings

10 Dec 2019

Learning word embeddings from distributional information is a widely studied task, with many approaches reported in the literature.

LEARNING WORD EMBEDDINGS MACHINE TRANSLATION

Neural Graph Embedding Methods for Natural Language Processing

8 Nov 2019

Knowledge graphs are structured representations of facts in a graph, where nodes represent entities and edges represent relationships between them.

GRAPH EMBEDDING KNOWLEDGE GRAPHS LEARNING WORD EMBEDDINGS LINK PREDICTION RELATION EXTRACTION
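The abstract describes knowledge graphs as entities (nodes) linked by relationships (edges), and the entry is tagged with link prediction. As a hedged sketch of how such graphs are typically embedded and scored, here is a minimal TransE-style scorer (TransE is not named in the abstract; the triples and random embeddings are illustrative assumptions):

```python
import numpy as np

# A toy knowledge graph as (head, relation, tail) triples -- assumed examples.
triples = [("Paris", "capital_of", "France"),
           ("Berlin", "capital_of", "Germany")]

# Hypothetical random embeddings for entities and relations.
rng = np.random.default_rng(0)
dim = 8
entities = {e: rng.normal(size=dim) for t in triples for e in (t[0], t[2])}
relations = {t[1]: rng.normal(size=dim) for t in triples}

def transe_score(h, r, t):
    """TransE plausibility: a smaller ||h + r - t|| (higher score here)
    means the triple is modelled as more plausible."""
    return -np.linalg.norm(entities[h] + relations[r] - entities[t])

for h, r, t in triples:
    print(h, r, t, transe_score(h, r, t))
```

Link prediction then amounts to ranking candidate tails (or heads) by this score; real systems train the embeddings rather than sampling them randomly as done here.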

Learning Word Embeddings without Context Vectors

WS 2019

Most word embedding algorithms, such as word2vec or fastText, construct two sorts of vectors: one for words and one for contexts.

LEARNING WORD EMBEDDINGS
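The two vector sets the abstract refers to can be made concrete with a small sketch of word2vec's skip-gram setup: one matrix for target words, a separate one for context words, with the model scoring a (word, context) pair by the dot product of the two. The vocabulary, dimensions, and random initialization below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
vocab = ["the", "cat", "sat", "mat"]
idx = {w: i for i, w in enumerate(vocab)}
dim = 4

# The two sorts of vectors: a matrix of word (target) vectors and a
# separate matrix of context vectors, as in word2vec (toy sizes).
W_word = rng.normal(scale=0.1, size=(len(vocab), dim))
W_context = rng.normal(scale=0.1, size=(len(vocab), dim))

def sgns_prob(word, context):
    """Skip-gram with negative sampling models P(pair observed in data)
    as the sigmoid of the word-vector / context-vector dot product."""
    score = W_word[idx[word]] @ W_context[idx[context]]
    return 1.0 / (1.0 + np.exp(-score))

p = sgns_prob("cat", "sat")
print(0.0 < p < 1.0)  # True
```

After training, most pipelines keep only `W_word` (or sum the two matrices); the paper above asks whether the separate context vectors are needed at all.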

Towards Incremental Learning of Word Embeddings Using Context Informativeness

ACL 2019

In this paper, we investigate the task of learning word embeddings from very sparse data in an incremental, cognitively plausible way.

INCREMENTAL LEARNING LEARNING WORD EMBEDDINGS

Learning Word Embeddings with Domain Awareness

7 Jun 2019

Word embeddings are traditionally trained on a large corpus in an unsupervised setting, with no specific design for incorporating domain knowledge.

LEARNING WORD EMBEDDINGS

Words are Vectors, Dependencies are Matrices: Learning Word Embeddings from Dependency Graphs

WS 2019

Distributional Semantic Models (DSMs) construct vector representations of word meanings based on their contexts.

LEARNING WORD EMBEDDINGS
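The abstract's starting point, that Distributional Semantic Models build word vectors from contexts, can be sketched with a few lines of plain Python. The tiny corpus and window size are assumptions; the point is only that words sharing contexts (here "cat" and "dog") end up with overlapping count vectors:

```python
from collections import Counter

# Tiny assumed corpus; a DSM builds each word's vector from the contexts
# (here: immediate neighbours within a +/-1 window) it appears with.
corpus = "the cat sat on the mat the dog sat on the rug".split()
window = 1

cooc = {w: Counter() for w in set(corpus)}
for i, w in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            cooc[w][corpus[j]] += 1

# "cat" and "dog" occur in identical contexts ("the", "sat"), so their
# count vectors coincide -- the distributional basis for similar embeddings.
print(cooc["cat"])
print(cooc["dog"])
```

The paper above extends this idea by labelling each context with its dependency relation, represented as a matrix rather than a bag of neighbouring words.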