Input Embedding Factorization

Adaptive Input Representations

Introduced by Baevski et al. in Adaptive Input Representations for Neural Language Modeling

Adaptive input embeddings extend the adaptive softmax factorization to input word representations. The vocabulary is partitioned by frequency into clusters: frequent words receive full-dimensional embeddings, while successively rarer clusters receive progressively smaller ones that are projected back up to the model dimension. This assigns more capacity to frequent words, cuts the total number of embedding parameters, and reduces overfitting on rare words.
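The factorization above can be sketched as follows. This is a minimal illustrative implementation, not the authors' reference code: the function names (`make_adaptive_input`, `embed`), the capacity-reduction `factor`, and the example cutoffs are all assumptions for the sketch. Each frequency cluster gets its own embedding table whose dimension shrinks by `factor` per cluster, plus a linear projection back to the model dimension so all tokens end up with vectors of the same size.

```python
import numpy as np

def make_adaptive_input(vocab_size, cutoffs, d_model, factor=4, seed=0):
    """Build factorized embedding tables (hypothetical helper).

    Tokens are assumed sorted by frequency, so cluster 0 (ids below the
    first cutoff) holds the most frequent words at full dimension d_model;
    each later cluster's dimension is divided by `factor`.
    """
    rng = np.random.default_rng(seed)
    bounds = [0] + list(cutoffs) + [vocab_size]   # cluster boundaries
    tables, projs = [], []
    for i in range(len(bounds) - 1):
        n = bounds[i + 1] - bounds[i]             # cluster size
        d = d_model // (factor ** i)              # shrinking capacity
        tables.append(rng.standard_normal((n, d)) * 0.02)
        projs.append(rng.standard_normal((d, d_model)) * 0.02)
    return bounds, tables, projs

def embed(token_ids, bounds, tables, projs):
    """Look up each token in its cluster's table, project to d_model."""
    d_model = projs[0].shape[1]
    out = np.zeros((len(token_ids), d_model))
    for j, t in enumerate(token_ids):
        for i in range(len(tables)):
            if bounds[i] <= t < bounds[i + 1]:
                out[j] = tables[i][t - bounds[i]] @ projs[i]
                break
    return out
```

For example, with a 1,000-word vocabulary, cutoffs at 100 and 400, and `d_model=64`, the three clusters use 64-, 16-, and 4-dimensional embeddings respectively, so the 600 rarest words consume far fewer parameters than the 100 most frequent ones.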
