# Semantic Similarity

101 papers with code · Natural Language Processing

The main objective of Semantic Similarity is to measure the distance between the semantic meanings of a pair of words, phrases, sentences, or documents. For example, the word “car” is more similar to “bus” than it is to “cat”. The two main approaches to measuring Semantic Similarity are knowledge-based approaches and corpus-based, distributional methods.
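As an illustration of the corpus-based, distributional view, cosine similarity over word vectors can rank “bus” closer to “car” than “cat”. The 3-dimensional vectors below are invented toy values for illustration, not taken from any real embedding model:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors chosen so "car" and "bus" share a "vehicle" direction.
embeddings = {
    "car": [0.9, 0.8, 0.1],
    "bus": [0.8, 0.9, 0.2],
    "cat": [0.1, 0.2, 0.9],
}

car_bus = cosine_similarity(embeddings["car"], embeddings["bus"])
car_cat = cosine_similarity(embeddings["car"], embeddings["cat"])
assert car_bus > car_cat  # "car" is more similar to "bus" than to "cat"
```

Knowledge-based approaches would instead consult a structured resource such as WordNet rather than vector geometry.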


# Improving Language Understanding by Generative Pre-Training

We demonstrate that large gains on these tasks can be realized by generative pre-training of a language model on a diverse corpus of unlabeled text, followed by discriminative fine-tuning on each specific task.

31,991

# ERNIE: Enhanced Representation through Knowledge Integration

We present a novel language representation model enhanced by knowledge called ERNIE (Enhanced Representation through kNowledge IntEgration).

3,493

# Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks

However, it requires that both sentences are fed into the network, which causes a massive computational overhead: Finding the most similar pair in a collection of 10,000 sentences requires about 50 million inference computations (~65 hours) with BERT.
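The ~50 million figure is just the number of unordered sentence pairs in a collection of 10,000 sentences, n(n-1)/2, each pair requiring one forward pass through the cross-encoder. A quick check:

```python
n = 10_000
# Each unordered pair of distinct sentences needs one BERT inference.
pairs = n * (n - 1) // 2
print(pairs)  # 49995000, i.e. roughly 50 million
```

Sentence-BERT avoids this by encoding each sentence once into a fixed vector, so similarity search reduces to n encodings plus cheap vector comparisons.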

Ranked #5 on Semantic Textual Similarity on STS Benchmark (Spearman Correlation metric)

2,524

# Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks

Because of their superior ability to preserve sequence information over time, Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have obtained strong results on a variety of sequence modeling tasks.

1,793

# A Hybrid Neural Network Model for Commonsense Reasoning

An HNN consists of two component models, a masked language model and a semantic similarity model, which share a BERT-based contextual encoder but use different model-specific input and output layers.

1,447


# Photographic Text-to-Image Synthesis with a Hierarchically-nested Adversarial Network

This paper presents a novel method to deal with the challenging task of generating photographic images conditioned on semantic image descriptions.

145

# No Fuss Distance Metric Learning using Proxies

Traditionally, for this problem supervision is expressed in the form of sets of points that follow an ordinal relationship -- an anchor point $x$ is similar to a set of positive points $Y$, and dissimilar to a set of negative points $Z$, and a loss defined over these distances is minimized.
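The traditional supervision the abstract describes can be sketched as a triplet-style hinge loss over distances. This is a minimal illustrative sketch, not the paper's proxy-based method; the Euclidean distance and the margin value are assumed choices:

```python
import math

def euclidean(a, b):
    # Straight-line distance between two points.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def triplet_loss(anchor, positives, negatives, margin=1.0):
    # Hinge loss: positives should be at least `margin` closer
    # to the anchor than any negative; violations are penalized.
    loss = 0.0
    for p in positives:
        for n in negatives:
            loss += max(0.0, euclidean(anchor, p) - euclidean(anchor, n) + margin)
    return loss

x = [0.0, 0.0]                 # anchor point
Y = [[0.1, 0.0], [0.0, 0.2]]   # similar (positive) points
Z = [[3.0, 3.0]]               # dissimilar (negative) points
print(triplet_loss(x, Y, Z))   # 0.0: every negative is already far enough away
```

The paper's contribution is to replace the sets Y and Z with a small number of learned proxy points, shrinking the number of triplets the loss must range over.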

124

# Portuguese Word Embeddings: Evaluating on Word Analogies and Natural Language Tasks

Word embeddings have been found to provide meaningful representations for words in an efficient way; therefore, they have become common in Natural Language Processing systems.

120

# Counter-fitting Word Vectors to Linguistic Constraints

In this work, we present a novel counter-fitting method which injects antonymy and synonymy constraints into vector space representations in order to improve the vectors' capability for judging semantic similarity.

110