Sentence-Embedding
125 papers with code • 1 benchmark • 2 datasets
Benchmarks
These leaderboards are used to track progress in Sentence-Embedding
Libraries
Use these libraries to find Sentence-Embedding models and implementations.
Most implemented papers
On the Sentence Embeddings from Pre-trained Language Models
Pre-trained contextual representations like BERT have achieved great success in natural language processing.
Learning Semantic Sentence Embeddings using Sequential Pair-wise Discriminator
One way to ensure this is to add constraints so that embeddings of true paraphrases are close while embeddings of unrelated candidate sentences are far apart.
Neural Sentence Embedding using Only In-domain Sentences for Out-of-domain Sentence Detection in Dialog Systems
Then we used domain-category analysis as an auxiliary task to train neural sentence embedding for OOD sentence detection.
Learning to Embed Sentences Using Attentive Recursive Trees
Sentence embedding is an effective feature representation for most deep learning-based NLP tasks.
Sentence Embedding Alignment for Lifelong Relation Extraction
We formulate such a challenging problem as lifelong relation extraction and investigate memory-efficient incremental learning methods without catastrophically forgetting knowledge learned from previous tasks.
A Bilingual Generative Transformer for Semantic Sentence Embedding
Semantic sentence embedding models encode natural language sentences into vectors, such that closeness in embedding space indicates closeness in the semantics between the sentences.
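"Closeness in embedding space" is typically measured with cosine similarity. The sketch below illustrates the idea with hypothetical toy vectors (the values and dimensionality are invented for illustration, not taken from any model):

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-dimensional "sentence embeddings" (illustrative values only).
paraphrase_a = [0.9, 0.1, 0.2]
paraphrase_b = [0.85, 0.15, 0.25]
unrelated    = [0.1, 0.9, -0.3]

print(cosine_similarity(paraphrase_a, paraphrase_b))  # close to 1.0
print(cosine_similarity(paraphrase_a, unrelated))     # much lower
```

Real sentence encoders produce vectors with hundreds of dimensions, but the comparison works the same way.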
ESimCSE: Enhanced Sample Building Method for Contrastive Learning of Unsupervised Sentence Embedding
Unsup-SimCSE takes dropout as a minimal data augmentation method, and passes the same input sentence to a pre-trained Transformer encoder (with dropout turned on) twice to obtain the two corresponding embeddings to build a positive pair.
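The "same input, two noisy forward passes" idea can be sketched in plain Python. This is only a toy stand-in for the pre-trained Transformer encoder: the "encoder" here is mean pooling with inverted dropout, and all vectors are randomly generated for illustration:

```python
import random

random.seed(0)

def encode_with_dropout(token_vectors, drop_prob=0.1):
    # Toy stand-in for a Transformer encoder with dropout enabled:
    # randomly zero out features, then mean-pool the tokens into one
    # embedding. (Unsup-SimCSE uses a pre-trained Transformer; this
    # only sketches the two-stochastic-passes idea.)
    dim = len(token_vectors[0])
    pooled = [0.0] * dim
    for tok in token_vectors:
        for i, x in enumerate(tok):
            if random.random() >= drop_prob:        # keep the feature
                pooled[i] += x / (1.0 - drop_prob)  # inverted-dropout scaling
    return [p / len(token_vectors) for p in pooled]

# One "sentence" as a list of token vectors (illustrative values only).
sentence = [[random.gauss(0, 1) for _ in range(8)] for _ in range(5)]

# Pass the SAME sentence through the noisy encoder twice -> two slightly
# different views, used as a positive pair for contrastive learning.
z1 = encode_with_dropout(sentence)
z2 = encode_with_dropout(sentence)
print(z1 != z2)  # the dropout masks differ, so the two views differ
```

Because the two views come from the same sentence, they stay close in embedding space while differing enough to give the contrastive objective a useful signal.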
Smoothed Contrastive Learning for Unsupervised Sentence Embedding
Contrastive learning has gradually been applied to learning high-quality unsupervised sentence embeddings.
InfoCSE: Information-aggregated Contrastive Learning of Sentence Embeddings
Contrastive learning has been extensively studied in sentence embedding learning, which assumes that the embeddings of different views of the same sentence are closer to each other than to the embeddings of other sentences.
miCSE: Mutual Information Contrastive Learning for Low-shot Sentence Embeddings
This study opens up avenues for efficient self-supervised learning methods that are more robust than current contrastive methods for sentence embedding.