Sentence Embedding

132 papers with code • 0 benchmarks • 7 datasets

Most implemented papers

On the Sentence Embeddings from Pre-trained Language Models

bohanli/BERT-flow EMNLP 2020

Pre-trained contextual representations like BERT have achieved great success in natural language processing.

Learning Semantic Sentence Embeddings using Sequential Pair-wise Discriminator

dev-chauhan/PQG-pytorch COLING 2018

One way to ensure this is to add constraints that pull the embeddings of true paraphrases close together and push the embeddings of unrelated paraphrase candidates far apart.
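That close/far constraint can be sketched as a margin loss over cosine similarity. This is a toy illustration only; the vectors and the `paraphrase_margin_loss` helper are hypothetical, not the paper's actual objective:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def paraphrase_margin_loss(anchor, positive, negative, margin=0.5):
    """Zero when the true paraphrase is at least `margin` more similar to
    the anchor than the unrelated candidate; positive otherwise."""
    return max(0.0, margin - cosine(anchor, positive) + cosine(anchor, negative))

# Toy embeddings: the true paraphrase points roughly the same way as the anchor.
anchor   = [1.0, 0.2, 0.1]
positive = [0.9, 0.3, 0.0]   # true paraphrase embedding (close)
negative = [-0.2, 1.0, 0.5]  # unrelated candidate embedding (far)

loss = paraphrase_margin_loss(anchor, positive, negative)  # → 0.0, constraint satisfied
```

Training on such a loss pushes the encoder to satisfy exactly the close/far constraint the excerpt describes.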

Neural Sentence Embedding using Only In-domain Sentences for Out-of-domain Sentence Detection in Dialog Systems

BevoLEt/Neural-sentence-embedding 27 Jul 2018

We then used domain-category analysis as an auxiliary task to train neural sentence embeddings for OOD sentence detection.

Context Mover's Distance & Barycenters: Optimal Transport of Contexts for Building Representations

sidak/context-mover-distance-and-barycenters 29 Aug 2018

We present a framework for building unsupervised representations of entities and their compositions, where each entity is viewed as a probability distribution rather than a vector embedding.

Learning to Embed Sentences Using Attentive Recursive Trees

shijx12/AR-Tree 6 Nov 2018

Sentence embedding is an effective feature representation for most deep learning-based NLP tasks.

Sentence Embedding Alignment for Lifelong Relation Extraction

hongwang600/Lifelong_Relation_Detection NAACL 2019

We formulate such a challenging problem as lifelong relation extraction and investigate memory-efficient incremental learning methods without catastrophically forgetting knowledge learned from previous tasks.

Discovering the Compositional Structure of Vector Representations with Role Learning Networks

psoulos/role-decomposition EMNLP (BlackboxNLP) 2020

How can neural networks perform so well on compositional tasks even though they lack explicit compositional representations?

A Bilingual Generative Transformer for Semantic Sentence Embedding

jwieting/bilingual-generative-transformer EMNLP 2020

Semantic sentence embedding models encode natural language sentences into vectors, such that closeness in embedding space indicates closeness in the semantics between the sentences.
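For example, with a handful of hypothetical pre-computed embeddings (a real model would produce these), the semantically closest sentence to a query is the one with the highest cosine similarity:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

# Hypothetical embeddings standing in for a trained model's output.
corpus = {
    "A dog is running in the park.":  [0.9, 0.1, 0.2],
    "The stock market fell sharply.": [0.1, 0.9, 0.3],
}
query_vec = [0.8, 0.2, 0.1]  # stand-in embedding for "A puppy plays outside."

# Closeness in embedding space stands in for closeness in meaning.
best = max(corpus, key=lambda s: cosine(query_vec, corpus[s]))  # → the dog sentence
```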

ESimCSE: Enhanced Sample Building Method for Contrastive Learning of Unsupervised Sentence Embedding

caskcsg/sentemb COLING 2022

Unsup-SimCSE uses dropout as a minimal data augmentation: the same input sentence is passed through a pre-trained Transformer encoder twice, with dropout turned on, and the two resulting embeddings form a positive pair.
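A rough sketch of this positive-pair construction, with a random dropout mask standing in for the Transformer encoder's dropout (the `encode_with_dropout` function and the vector below are illustrative only):

```python
import random

def encode_with_dropout(vec, p=0.1, seed=None):
    """Toy stand-in for an encoder forward pass with dropout turned on:
    each call randomly zeroes components, so identical inputs can yield
    different outputs. Uses inverted-dropout scaling, as during training."""
    rng = random.Random(seed)
    scale = 1.0 / (1.0 - p)
    return [x * scale if rng.random() >= p else 0.0 for x in vec]

sentence_vec = [0.4, -0.7, 0.2, 0.9, -0.1]  # stand-in for one input sentence

# Pass the same "sentence" through the encoder twice; the dropout masks
# differ, so the two embeddings differ slightly and form a positive pair.
z1 = encode_with_dropout(sentence_vec, seed=1)
z2 = encode_with_dropout(sentence_vec, seed=2)
positive_pair = (z1, z2)
```

In the real method the contrastive loss then pulls `z1` and `z2` together while pushing embeddings of other sentences in the batch apart.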

Smoothed Contrastive Learning for Unsupervised Sentence Embedding

caskcsg/sentemb COLING 2022

Contrastive learning has gradually been adopted for learning high-quality unsupervised sentence embeddings.