Semantic Similarity

101 papers with code · Natural Language Processing

The main objective of Semantic Similarity is to measure the distance between the semantic meanings of a pair of words, phrases, sentences, or documents. For example, the word “car” is more similar to “bus” than it is to “cat”. The two main approaches to measuring Semantic Similarity are knowledge-based approaches and corpus-based, distributional methods.

Source: Visual and Semantic Knowledge Transfer for Large Scale Semi-supervised Object Detection
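
The corpus-based, distributional route reduces to comparing vectors: embed each item and score the pair with cosine similarity. A minimal sketch using pretrained GloVe vectors through gensim's downloader (the model name is one of gensim's hosted datasets; exact scores depend on the vectors used):

```python
# Distributional similarity as cosine similarity between word vectors.
import gensim.downloader as api

model = api.load("glove-wiki-gigaword-50")  # small pretrained GloVe vectors

print(model.similarity("car", "bus"))  # higher score: semantically close
print(model.similarity("car", "cat"))  # lower score: semantically distant
```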

Greatest papers with code

Improving Language Understanding by Generative Pre-Training

Preprint 2018 huggingface/transformers

We demonstrate that large gains on these tasks can be realized by generative pre-training of a language model on a diverse corpus of unlabeled text, followed by discriminative fine-tuning on each specific task.

DOCUMENT CLASSIFICATION LANGUAGE MODELLING NATURAL LANGUAGE INFERENCE NATURAL LANGUAGE UNDERSTANDING QUESTION ANSWERING SEMANTIC SIMILARITY SEMANTIC TEXTUAL SIMILARITY
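
The two-stage recipe maps directly onto the linked huggingface/transformers library: load the generatively pre-trained weights, then attach a classification head for discriminative fine-tuning. A hedged sketch (the "openai-gpt" checkpoint is the library's release of this model; training loop and task data omitted):

```python
# Generative pre-training + discriminative fine-tuning, sketched with transformers.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("openai-gpt")
model = AutoModelForSequenceClassification.from_pretrained("openai-gpt", num_labels=2)

# Pack a sentence pair for a similarity-style task and get task logits;
# fine-tuning updates both the new head and the pre-trained body.
batch = tokenizer("A man plays guitar.", "Someone plays an instrument.",
                  return_tensors="pt")
logits = model(**batch).logits
```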

Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks

IJCNLP 2019 UKPLab/sentence-transformers

However, it requires that both sentences be fed into the network, which causes a massive computational overhead: Finding the most similar pair in a collection of 10,000 sentences requires about 50 million inference computations (~65 hours) with BERT.

Ranked #5 on Semantic Textual Similarity on STS Benchmark (Spearman Correlation metric)

SEMANTIC SIMILARITY SEMANTIC TEXTUAL SIMILARITY SENTENCE EMBEDDINGS TRANSFER LEARNING
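
Encoding each sentence once is what removes that overhead: the 10,000-sentence search becomes one forward pass per sentence plus cheap vector comparisons. A short example with the linked sentence-transformers library (the model name is one of the repo's later pretrained checkpoints, assumed here for brevity):

```python
# Bi-encoder similarity: embed each sentence once, compare with cosine similarity.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")
sentences = ["A man is playing guitar.",
             "Someone plays an instrument.",
             "A cat sleeps on the sofa."]
embeddings = model.encode(sentences, convert_to_tensor=True)

print(util.cos_sim(embeddings, embeddings))  # pairwise similarity matrix
```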

Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks

IJCNLP 2015 tensorflow/fold

Because of their superior ability to preserve sequence information over time, Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have obtained strong results on a variety of sequence modeling tasks.

SEMANTIC SIMILARITY SENTIMENT ANALYSIS
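
The paper's Child-Sum Tree-LSTM generalizes the sequential LSTM cell to tree topologies by summing over children's hidden states and giving each child its own forget gate. A minimal PyTorch sketch of that cell (illustrative dimensions; not the linked tensorflow/fold implementation):

```python
# Child-Sum Tree-LSTM cell: children's hidden states are summed, and each
# child gets its own forget gate over its memory cell.
import torch
import torch.nn as nn

class ChildSumTreeLSTMCell(nn.Module):
    def __init__(self, x_dim, h_dim):
        super().__init__()
        self.iou = nn.Linear(x_dim + h_dim, 3 * h_dim)  # input, output, update
        self.f_x = nn.Linear(x_dim, h_dim)              # forget gate, input part
        self.f_h = nn.Linear(h_dim, h_dim)              # forget gate, per-child part

    def forward(self, x, child_h, child_c):
        # x: (x_dim,)  child_h, child_c: (num_children, h_dim)
        h_tilde = child_h.sum(dim=0)
        i, o, u = self.iou(torch.cat([x, h_tilde])).chunk(3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.f_x(x) + self.f_h(child_h))  # one gate per child
        c = i * u + (f * child_c).sum(dim=0)
        h = o * torch.tanh(c)
        return h, c
```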

A Hybrid Neural Network Model for Commonsense Reasoning

WS 2019 namisan/mt-dnn

An HNN consists of two component models, a masked language model and a semantic similarity model, which share a BERT-based contextual encoder but use different model-specific input and output layers.

LANGUAGE MODELLING SEMANTIC SIMILARITY SEMANTIC TEXTUAL SIMILARITY
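
The described architecture is easy to outline: one shared BERT encoder feeding two task-specific heads. A hedged sketch (head shapes are illustrative, not the paper's exact layers):

```python
# Shared BERT encoder with a masked-LM head and a similarity head.
import torch.nn as nn
from transformers import BertModel

class HybridNeuralNetwork(nn.Module):
    def __init__(self, vocab_size, hidden=768):
        super().__init__()
        self.encoder = BertModel.from_pretrained("bert-base-uncased")  # shared encoder
        self.mlm_head = nn.Linear(hidden, vocab_size)  # masked language model
        self.sim_head = nn.Linear(hidden, 1)           # semantic similarity model

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        mlm_logits = self.mlm_head(out.last_hidden_state)  # per-token vocab logits
        sim_score = self.sim_head(out.pooler_output)       # one score per input
        return mlm_logits, sim_score
```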

Photographic Text-to-Image Synthesis with a Hierarchically-nested Adversarial Network

CVPR 2018 ypxie/HDGan

This paper presents a novel method to deal with the challenging task of generating photographic images conditioned on semantic image descriptions.

IMAGE GENERATION SEMANTIC SIMILARITY SEMANTIC TEXTUAL SIMILARITY

No Fuss Distance Metric Learning using Proxies

ICCV 2017 dichotomies/proxy-nca

Traditionally, for this problem, supervision is expressed in the form of sets of points that follow an ordinal relationship: an anchor point $x$ is similar to a set of positive points $Y$ and dissimilar to a set of negative points $Z$, and a loss defined over these distances is minimized.

METRIC LEARNING SEMANTIC SIMILARITY SEMANTIC TEXTUAL SIMILARITY ZERO-SHOT LEARNING
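
The proxy trick replaces those point sets with one learnable vector per class, so each anchor is compared against a handful of proxies rather than against $Y$ and $Z$ directly. A hedged sketch of the resulting NCA-style loss:

```python
# Proxy-NCA-style loss: softmax over (negative) squared distances to class proxies.
import torch
import torch.nn.functional as F

def proxy_nca_loss(embeddings, labels, proxies):
    # embeddings: (B, D) anchors; labels: (B,); proxies: (C, D), one per class
    e = F.normalize(embeddings, dim=1)
    p = F.normalize(proxies, dim=1)
    sq_dists = torch.cdist(e, p) ** 2
    # Pull each anchor toward its own class proxy, away from all others.
    return F.cross_entropy(-sq_dists, labels)

proxies = torch.nn.Parameter(torch.randn(100, 128))  # learned jointly with the encoder
```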

Portuguese Word Embeddings: Evaluating on Word Analogies and Natural Language Tasks

WS 2017 nathanshartmann/portuguese_word_embeddings

Word embeddings have been found to provide meaningful representations for words in an efficient way; therefore, they have become common in Natural Language Processing systems.

SEMANTIC SIMILARITY SEMANTIC TEXTUAL SIMILARITY WORD EMBEDDINGS
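
The word-analogy evaluation is plain vector arithmetic over the released embeddings. A sketch using gensim (the file name is hypothetical; the repo's embeddings ship in word2vec text format):

```python
# Word-analogy check: rei - homem + mulher should land near rainha
# ("king" - "man" + "woman" ~ "queen").
from gensim.models import KeyedVectors

vectors = KeyedVectors.load_word2vec_format("skip_s300.txt")  # hypothetical file name

print(vectors.most_similar(positive=["rei", "mulher"], negative=["homem"], topn=3))
print(vectors.similarity("carro", "ônibus"))  # "car" vs. "bus"
```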

Counter-fitting Word Vectors to Linguistic Constraints

NAACL 2016 nmrksic/counter-fitting

In this work, we present a novel counter-fitting method which injects antonymy and synonymy constraints into vector space representations in order to improve the vectors' capability for judging semantic similarity.

DIALOGUE STATE TRACKING SEMANTIC SIMILARITY SEMANTIC TEXTUAL SIMILARITY
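
The core update is simple to sketch: pull synonym vectors together and push antonym vectors apart, leaving the rest of the space as intact as possible. A hedged, simplified version (the paper additionally preserves the original space's neighborhood structure, omitted here):

```python
# Counter-fitting, simplified: synonym attract + antonym repel.
import numpy as np

def counter_fit_step(vecs, synonyms, antonyms, lr=0.1, margin=1.0):
    # vecs: dict word -> np.ndarray; synonyms/antonyms: lists of word pairs
    for a, b in synonyms:               # pull synonyms together
        diff = vecs[a] - vecs[b]
        vecs[a] -= lr * diff
        vecs[b] += lr * diff
    for a, b in antonyms:               # push antonyms apart, up to a margin
        diff = vecs[a] - vecs[b]
        if np.linalg.norm(diff) < margin:
            vecs[a] += lr * diff
            vecs[b] -= lr * diff
    return vecs
```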