Word Embeddings

GloVe Embeddings

Introduced by Pennington et al. in GloVe: Global Vectors for Word Representation

GloVe Embeddings are a type of word embedding that encodes the ratio of co-occurrence probabilities between two words as vector differences. GloVe uses a weighted least-squares objective $J$ that minimizes the squared difference between the dot product of the vectors of two words and the logarithm of their number of co-occurrences:

$$ J = \sum_{i,j=1}^{V} f\left(X_{ij}\right)\left(w_{i}^{T}\tilde{w}_{j} + b_{i} + \tilde{b}_{j} - \log X_{ij}\right)^{2} $$

where $w_{i}$ and $b_{i}$ are the word vector and bias of word $i$, $\tilde{w}_{j}$ and $\tilde{b}_{j}$ are the context word vector and bias of word $j$, $X_{ij}$ is the number of times word $j$ occurs in the context of word $i$, and $f$ is a weighting function that assigns relatively lower weight to rare co-occurrences and caps the influence of very frequent ones.
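
As a rough illustration, the sketch below evaluates this objective with NumPy for a small dense co-occurrence matrix. The weighting function follows the form reported in the paper, $f(x) = (x/x_{\max})^{\alpha}$ for $x < x_{\max}$ and $1$ otherwise, with $x_{\max}=100$ and $\alpha=0.75$; the function names and toy data are illustrative, not part of any official implementation.

```python
import numpy as np

def weighting(x, x_max=100.0, alpha=0.75):
    # f(X_ij) as reported in the paper: (x / x_max)^alpha if x < x_max, else 1.
    return np.where(x < x_max, (x / x_max) ** alpha, 1.0)

def glove_objective(W, W_tilde, b, b_tilde, X):
    """Weighted least-squares objective J over a dense co-occurrence matrix X.

    W, W_tilde : (V, d) word and context-word vectors
    b, b_tilde : (V,)   word and context-word biases
    X          : (V, V) co-occurrence counts
    """
    i, j = np.nonzero(X)                     # only pairs with X_ij > 0 contribute
    dot = np.sum(W[i] * W_tilde[j], axis=1)  # w_i^T w~_j for each nonzero pair
    residual = dot + b[i] + b_tilde[j] - np.log(X[i, j])
    return np.sum(weighting(X[i, j]) * residual ** 2)

# Toy usage: random parameters for a 5-word vocabulary with 3-dimensional vectors.
rng = np.random.default_rng(0)
V, d = 5, 3
X = rng.poisson(2.0, size=(V, V)).astype(float)
J = glove_objective(rng.normal(size=(V, d)), rng.normal(size=(V, d)),
                    rng.normal(size=V), rng.normal(size=V), X)
print(J)
```

In training, $J$ is minimized with stochastic gradient updates (AdaGrad in the original paper) over the nonzero entries of $X$, and the final embedding for a word is typically taken as the sum $w + \tilde{w}$.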

Source: GloVe: Global Vectors for Word Representation

Tasks

Task                     Papers   Share
Sentiment Analysis       34       6.60%
General Classification   29       5.63%
Sentence                 28       5.44%
Text Classification      24       4.66%
Language Modelling       20       3.88%
Classification           18       3.50%
Machine Translation      16       3.11%
Question Answering       16       3.11%
Retrieval                14       2.72%
