Sentence

3412 papers with code • 0 benchmarks • 0 datasets

Most implemented papers

Bidirectional LSTM-CRF Models for Sequence Tagging

determined22/zh-ner-tf 9 Aug 2015

It can also use sentence-level tag information thanks to a CRF layer.
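The CRF layer scores whole tag sequences rather than individual tokens, combining the BiLSTM's per-token emission scores with learned tag-to-tag transition scores. A minimal sketch of the decoding side (Viterbi search over hypothetical emission and transition score matrices; function name and setup are illustrative, not from the listed repo):

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Find the highest-scoring tag sequence.

    emissions: (seq_len, n_tags) per-token tag scores (e.g. from a BiLSTM).
    transitions: (n_tags, n_tags) score of moving from tag i to tag j.
    """
    seq_len, n_tags = emissions.shape
    score = emissions[0].copy()          # best score ending in each tag so far
    backpointers = []
    for t in range(1, seq_len):
        # broadcast: previous score + transition + current emission
        total = score[:, None] + transitions + emissions[t][None, :]
        backpointers.append(total.argmax(axis=0))
        score = total.max(axis=0)
    best_tag = int(score.argmax())
    path = [best_tag]
    for bp in reversed(backpointers):    # follow backpointers to recover path
        best_tag = int(bp[best_tag])
        path.append(best_tag)
    return path[::-1]
```

The transition matrix is what lets the model use sentence-level information: a large negative entry can veto a locally attractive tag if the surrounding sequence makes it implausible.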

CIDEr: Consensus-based Image Description Evaluation

tylin/coco-caption CVPR 2015

We propose a novel paradigm for evaluating image descriptions that uses human consensus.
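The key idea is to score a candidate caption by its agreement with *all* human reference captions rather than any single one. A much-simplified sketch of that consensus idea (unigram term-frequency cosine averaged over references; the real CIDEr metric uses TF-IDF over 1–4-grams with corpus-level statistics, and these function names are illustrative):

```python
from collections import Counter
import math

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def cosine(a, b):
    # cosine similarity between two sparse count vectors (Counters)
    dot = sum(a[k] * b[k] for k in set(a) | set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def consensus_score(candidate, references, n=1):
    """Average n-gram agreement between a candidate caption and all references."""
    cand = Counter(ngrams(candidate.lower().split(), n))
    sims = [cosine(cand, Counter(ngrams(r.lower().split(), n))) for r in references]
    return sum(sims) / len(sims)
```

Averaging over references is what makes the score "consensus-based": a caption that matches what most annotators said outscores one that matches a single idiosyncratic reference.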

Universal Sentence Encoder

facebookresearch/InferSent 29 Mar 2018

For both variants, we investigate and report the relationship between model complexity, resource consumption, the availability of transfer task training data, and task performance.

Supervised Learning of Universal Sentence Representations from Natural Language Inference Data

facebookresearch/InferSent EMNLP 2017

Many modern NLP systems rely on word embeddings, previously trained in an unsupervised manner on large corpora, as base features.
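The simplest way to go from pretrained word embeddings to a sentence representation is to average them, which is the baseline InferSent-style encoders are measured against (this sketch is that baseline, not InferSent's BiLSTM max-pooling encoder; names are illustrative):

```python
import numpy as np

def sentence_vector(tokens, embeddings):
    """Bag-of-embeddings baseline: mean of the word vectors found in the table."""
    vecs = [embeddings[t] for t in tokens if t in embeddings]
    if not vecs:
        # no known words: return a zero vector of the embedding dimension
        return np.zeros(next(iter(embeddings.values())).shape)
    return np.mean(vecs, axis=0)
```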

SimCSE: Simple Contrastive Learning of Sentence Embeddings

princeton-nlp/SimCSE EMNLP 2021

This paper presents SimCSE, a simple contrastive learning framework that greatly advances state-of-the-art sentence embeddings.
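In the unsupervised setting, SimCSE encodes each sentence twice with different dropout masks and treats the two encodings as a positive pair, with other sentences in the batch as negatives, trained with an InfoNCE-style objective. A numpy sketch of that loss (the encoder itself is omitted; function name and the choice of temperature are illustrative):

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.05):
    """Contrastive loss over a batch of embedding pairs.

    z1[i] and z2[i] are two encodings of the same sentence (e.g. two
    dropout masks); the other rows of z2 act as in-batch negatives.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / temperature                     # (batch, batch) cosines
    sim = sim - sim.max(axis=1, keepdims=True)        # numerical stability
    log_softmax = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_softmax))             # maximize the diagonal
```

Minimizing this pulls each sentence's two encodings together while pushing it away from the rest of the batch, which is what produces useful sentence embeddings.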

A Neural Conversational Model

farizrahman4u/seq2seq 19 Jun 2015

We find that this straightforward model can generate simple conversations given a large conversational training dataset.

A Sensitivity Analysis of (and Practitioners' Guide to) Convolutional Neural Networks for Sentence Classification

brightmart/text_classification IJCNLP 2017

Convolutional Neural Networks (CNNs) have recently achieved remarkably strong performance on the practically important task of sentence classification (Kim, 2014; Kalchbrenner, 2014; Johnson, 2014).
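The core operation in these models is a 1-D convolution over word embeddings followed by max-over-time pooling: each filter slides across the sentence and its strongest response anywhere becomes one sentence-level feature. A bare numpy sketch of one such pass (loop form for clarity; names and the tanh nonlinearity are illustrative, and a real model learns many filters of several window sizes):

```python
import numpy as np

def conv_max_pool(embeddings, filters):
    """One convolution + max-over-time pooling pass.

    embeddings: (seq_len, dim) word vectors for one sentence.
    filters: (n_filters, window, dim) convolution filters.
    Returns an (n_filters,) sentence feature vector.
    """
    seq_len, dim = embeddings.shape
    n_filters, window, _ = filters.shape
    feats = np.full(n_filters, -np.inf)
    for start in range(seq_len - window + 1):
        patch = embeddings[start:start + window]            # (window, dim)
        resp = np.tanh((filters * patch).sum(axis=(1, 2)))  # one score per filter
        feats = np.maximum(feats, resp)                     # max over positions
    return feats
```

Max pooling is why the sentence can be any length: only the best match per filter survives, regardless of where in the sentence it occurred.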

Show and Tell: Lessons learned from the 2015 MSCOCO Image Captioning Challenge

tensorflow/models 21 Sep 2016

Automatically describing the content of an image is a fundamental problem in artificial intelligence that connects computer vision and natural language processing.

Text Summarization with Pretrained Encoders

nlpyang/PreSumm IJCNLP 2019

For abstractive summarization, we propose a new fine-tuning schedule which adopts different optimizers for the encoder and the decoder as a means of alleviating the mismatch between the two (the former is pretrained while the latter is not).
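The mismatch is that a pretrained encoder wants small, cautious updates while a randomly initialized decoder needs aggressive ones, so each gets its own learning-rate schedule. A sketch of a warmup-then-inverse-sqrt-decay schedule applied per component (the schedule shape matches common Transformer fine-tuning practice; the specific numbers below are illustrative, not the paper's hyperparameters):

```python
def lr_at(step, base_lr, warmup):
    """Linear warmup to `warmup` steps, then inverse-square-root decay."""
    return base_lr * min(step ** -0.5, step * warmup ** -1.5)

# Two parameter groups with separate schedules, e.g.:
#   encoder (pretrained):  small base lr, long warmup  -> gentle updates
#   decoder (from scratch): large base lr, short warmup -> fast early learning
encoder_lr = lr_at(1000, base_lr=0.002, warmup=20000)
decoder_lr = lr_at(1000, base_lr=0.1,   warmup=10000)
```

In practice this is implemented as two optimizer parameter groups (or two optimizers), one for the encoder's parameters and one for the decoder's.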

Generating Sentences from a Continuous Space

PaddlePaddle/PaddleNLP CoNLL 2016

The standard recurrent neural network language model (RNNLM) generates sentences one word at a time and does not work from an explicit global sentence representation.
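The proposed fix is a variational autoencoder: the whole sentence is summarized as a distribution over a continuous latent vector, a sample of which conditions the decoder. Two pieces make that trainable, sketched here in numpy (function names are illustrative; the encoder/decoder networks are omitted):

```python
import numpy as np

def sample_latent(mu, log_var, rng):
    """Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I),
    so gradients can flow through the sampling step."""
    eps = rng.normal(size=mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    """KL(q(z|x) || N(0, I)) for a diagonal Gaussian, summed over dimensions.
    This is the regularizer that keeps the latent space smooth and usable
    as an explicit global sentence representation."""
    return 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var)
```

The KL term is zero exactly when the posterior equals the prior, and grows as the encoder commits to a specific region of the latent space.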