Sentence Embeddings

219 papers with code • 0 benchmarks • 11 datasets

Latest papers with no code

Enhancing Cross-lingual Sentence Embedding for Low-resource Languages with Word Alignment

no code yet • 3 Apr 2024

The field of cross-lingual sentence embeddings has recently experienced significant advancements, but research concerning low-resource languages has lagged due to the scarcity of parallel corpora.

Semantically Enriched Cross-Lingual Sentence Embeddings for Crisis-related Social Media Texts

no code yet • 25 Mar 2024

Tasks such as semantic search and clustering on crisis-related social media texts enhance our comprehension of crisis discourse, aiding decision-making and targeted interventions.
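Semantic search over sentence embeddings of this kind typically ranks a corpus by cosine similarity to a query embedding. A minimal sketch, using toy vectors in place of real model output (the 4-dimensional "embeddings" below are purely illustrative):

```python
import numpy as np

def cosine_search(query_vec, corpus_vecs, top_k=2):
    """Rank corpus sentences by cosine similarity to a query embedding."""
    q = query_vec / np.linalg.norm(query_vec)
    c = corpus_vecs / np.linalg.norm(corpus_vecs, axis=1, keepdims=True)
    sims = c @ q                      # cosine similarity per corpus sentence
    return np.argsort(-sims)[:top_k]  # indices of the top-k matches

# Toy 4-dimensional "embeddings" standing in for real model output.
corpus = np.array([
    [0.9, 0.1, 0.0, 0.0],   # e.g. a flood report
    [0.0, 0.8, 0.2, 0.0],   # e.g. a donation appeal
    [0.85, 0.2, 0.1, 0.0],  # another flood report
])
query = np.array([1.0, 0.0, 0.0, 0.0])

print(cosine_search(query, corpus))  # the two flood-like sentences rank first
```

With real crisis-related texts, the toy vectors would be replaced by the output of a multilingual sentence encoder; the ranking step is unchanged.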

Evaluating Unsupervised Dimensionality Reduction Methods for Pretrained Sentence Embeddings

no code yet • 20 Mar 2024

Sentence embeddings produced by Pretrained Language Models (PLMs) have received wide attention from the NLP community due to their superior performance when representing texts in numerous downstream applications.
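One common unsupervised reduction the paper's setting implies is PCA. A minimal sketch (random vectors stand in for PLM embeddings; the variance check is one simple quality proxy, not the paper's evaluation protocol):

```python
import numpy as np

rng = np.random.default_rng(0)
emb = rng.normal(size=(100, 32))   # stand-in for 100 PLM sentence embeddings

# PCA via SVD on the centered matrix: keep the top-k principal directions.
k = 8
centered = emb - emb.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
reduced = centered @ vt[:k].T      # (100, 8) compressed embeddings

# One unsupervised quality check: how much variance the k components retain.
explained = (s[:k] ** 2).sum() / (s ** 2).sum()
print(f"{explained:.2%} of variance kept in {k}/{emb.shape[1]} dimensions")
```

In practice the compressed embeddings would then be scored on downstream tasks (e.g. STS benchmarks) against the full-dimensional originals.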

Adaptative Bilingual Aligning Using Multilingual Sentence Embedding

no code yet • 18 Mar 2024

In this paper, we present an adaptive bitextual alignment system called AIlign.

RobustSentEmbed: Robust Sentence Embeddings Using Adversarial Self-Supervised Contrastive Learning

no code yet • 17 Mar 2024

In this paper, we introduce RobustSentEmbed, a self-supervised sentence embedding framework designed to improve both generalization across text representation tasks and robustness against a broad range of adversarial attacks.


To Label or Not to Label: Hybrid Active Learning for Neural Machine Translation

no code yet • 14 Mar 2024

A weighted hybrid score that combines uncertainty and diversity is then used to select the top instances for annotation in each AL iteration.
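A weighted hybrid score of this shape can be sketched as below. The min-max normalization and the 0.5 weight are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def hybrid_score(uncertainty, diversity, weight=0.5):
    """Weighted combination of min-max-normalized uncertainty and diversity.

    Normalization is an assumption here; it puts both signals on [0, 1]
    before mixing so neither dominates purely by scale.
    """
    def norm(x):
        x = np.asarray(x, dtype=float)
        span = x.max() - x.min()
        return (x - x.min()) / span if span else np.zeros_like(x)
    return weight * norm(uncertainty) + (1 - weight) * norm(diversity)

# Toy pool: per-sentence model uncertainty and distance to already-selected data.
uncertainty = [0.9, 0.2, 0.6, 0.1]
diversity   = [0.1, 0.8, 0.7, 0.2]

scores = hybrid_score(uncertainty, diversity, weight=0.5)
picked = np.argsort(-scores)[:2]   # top-2 instances to send for annotation
print(picked)
```

The instance that is only uncertain (index 0) loses to instances that score well on both signals, which is the point of mixing the two criteria.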

Hyper-CL: Conditioning Sentence Representations with Hypernetworks

no code yet • 14 Mar 2024

While contrastive learning frameworks have significantly advanced sentence representation learning, it remains unclear whether state-of-the-art sentence embeddings can capture the fine-grained semantics of sentences, particularly when conditioned on specific perspectives.

Cross-lingual Transfer or Machine Translation? On Data Augmentation for Monolingual Semantic Textual Similarity

no code yet • 8 Mar 2024

Rather, we find that the Wikipedia domain outperforms the NLI domain for these languages, in contrast to prior studies that focused on NLI as training data.

Meta-Task Prompting Elicits Embedding from Large Language Models

no code yet • 28 Feb 2024

In this work, we introduce a new unsupervised embedding method, Meta-Task Prompting with Explicit One-Word Limitation (MetaEOL), for generating high-quality sentence embeddings from Large Language Models (LLMs) without the need for model fine-tuning or task-specific engineering.

Self-Adaptive Reconstruction with Contrastive Learning for Unsupervised Sentence Embeddings

no code yet • 23 Feb 2024

However, due to token bias in pretrained language models, the models cannot capture the fine-grained semantics in sentences, which leads to poor predictions.