Sentence
3412 papers with code • 0 benchmarks • 0 datasets
Most implemented papers
Bidirectional LSTM-CRF Models for Sequence Tagging
It can also use sentence-level tag information thanks to a CRF layer.
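The core of the model is a bidirectional LSTM that produces per-token emission scores, plus a CRF layer whose learned transition matrix scores whole tag sequences. Below is a minimal PyTorch sketch of that structure; the class name, dimensions, and the `sequence_score` helper are illustrative, and a trainable CRF would additionally need the forward-algorithm partition function (as provided by libraries such as pytorch-crf).

```python
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    """Sketch of a BiLSTM emission model with CRF transition scores (sizes illustrative)."""
    def __init__(self, vocab_size, num_tags, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.emit = nn.Linear(2 * hidden_dim, num_tags)
        # CRF transition scores: transitions[i, j] = score of moving from tag i to tag j.
        self.transitions = nn.Parameter(torch.randn(num_tags, num_tags))

    def emissions(self, token_ids):
        out, _ = self.lstm(self.embed(token_ids))
        return self.emit(out)  # (batch, seq_len, num_tags)

def sequence_score(emissions, tags, transitions):
    """Score of one tag sequence for one sentence: emissions is (seq_len, num_tags);
    the score sums per-token emission scores and tag-to-tag transition scores."""
    score = emissions[0, tags[0]]
    for t in range(1, len(tags)):
        score = score + transitions[tags[t - 1], tags[t]] + emissions[t, tags[t]]
    return score
```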
CIDEr: Consensus-based Image Description Evaluation
We propose a novel paradigm for evaluating image descriptions that uses human consensus.
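The consensus idea is to reward n-grams that the pool of human reference captions agrees on, via TF-IDF-weighted n-gram similarity between the candidate and each reference. A heavily simplified, self-contained sketch of that scoring idea follows; the function name and smoothing are mine, and the real metric scores each n-gram order separately and adds further refinements (e.g. CIDEr-D's count clipping and length penalty).

```python
from collections import Counter
import math

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def cider_like_score(candidate, references, all_reference_sets, n=4):
    """Toy consensus score: mean cosine similarity of TF-IDF n-gram vectors
    between a candidate caption and each reference caption (a simplification of CIDEr)."""
    num_images = len(all_reference_sets)
    # Document frequency of each n-gram over the reference corpus (one "document" per image).
    df = Counter()
    for refs in all_reference_sets:
        seen = set()
        for ref in refs:
            for k in range(1, n + 1):
                seen.update(ngrams(ref.split(), k))
        df.update(seen)

    def tfidf(tokens):
        counts = Counter(g for k in range(1, n + 1) for g in ngrams(tokens, k))
        return {g: c * math.log((1 + num_images) / (1 + df[g])) for g, c in counts.items()}

    def cosine(a, b):
        dot = sum(a[g] * b.get(g, 0.0) for g in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    cand_vec = tfidf(candidate.split())
    return sum(cosine(cand_vec, tfidf(r.split())) for r in references) / len(references)
```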
Universal Sentence Encoder
For both variants, we investigate and report the relationship between model complexity, resource consumption, the availability of transfer task training data, and task performance.
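The pretrained encoders were released as TensorFlow Hub modules, so trying one takes a few lines. The sketch below assumes the standard TF2 module URL is still served and that `tensorflow_hub` is installed.

```python
import tensorflow_hub as hub
import numpy as np

# Load the published Universal Sentence Encoder module from TensorFlow Hub
# (standard TF2 module URL; availability and version may change).
embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")

sentences = [
    "The quick brown fox jumps over the lazy dog.",
    "A fast auburn fox leaps above a sleepy hound.",
]
embeddings = embed(sentences).numpy()  # shape: (2, 512)

# Cosine similarity between the two sentence embeddings.
a, b = embeddings
print(float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))))
```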
Supervised Learning of Universal Sentence Representations from Natural Language Inference Data
Many modern NLP systems rely on word embeddings, previously trained in an unsupervised manner on large corpora, as base features.
SimCSE: Simple Contrastive Learning of Sentence Embeddings
This paper presents SimCSE, a simple contrastive learning framework that greatly advances state-of-the-art sentence embeddings.
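In the unsupervised variant, the only "augmentation" is standard dropout: each sentence is encoded twice, and the two dropout views of the same sentence form the positive pair in an in-batch contrastive (InfoNCE) loss. A minimal PyTorch sketch of that objective is below; the `encoder` callable and temperature are placeholders, not the authors' code.

```python
import torch
import torch.nn.functional as F

def simcse_style_loss(encoder, sentences, temperature=0.05):
    """Unsupervised SimCSE-style objective (sketch): `encoder` is any
    dropout-containing module mapping a list of strings to a (batch, dim) tensor."""
    z1 = encoder(sentences)              # first dropout view, (batch, dim)
    z2 = encoder(sentences)              # second dropout view, (batch, dim)
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    sim = z1 @ z2.T / temperature        # cosine similarities, (batch, batch)
    labels = torch.arange(sim.size(0))   # the positive is the matching index
    return F.cross_entropy(sim, labels)
```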
A Neural Conversational Model
We find that this straightforward model can generate simple conversations given a large conversational training dataset.
A Sensitivity Analysis of (and Practitioners' Guide to) Convolutional Neural Networks for Sentence Classification
Convolutional Neural Networks (CNNs) have recently achieved remarkably strong performance on the practically important task of sentence classification (Kim 2014, Kalchbrenner 2014, Johnson 2014).
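The hyperparameters the sensitivity analysis varies (filter widths, number of feature maps, dropout rate) appear directly in a Kim-style architecture: parallel 1-D convolutions over word embeddings, max-over-time pooling, and a linear classifier. A compact PyTorch sketch with illustrative defaults:

```python
import torch
import torch.nn as nn

class TextCNN(nn.Module):
    """Kim-style CNN for sentence classification (sketch); hyperparameters are illustrative."""
    def __init__(self, vocab_size, num_classes, embed_dim=300,
                 filter_sizes=(3, 4, 5), num_filters=100, dropout=0.5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, kernel_size=k) for k in filter_sizes
        )
        self.dropout = nn.Dropout(dropout)
        self.fc = nn.Linear(num_filters * len(filter_sizes), num_classes)

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)      # (batch, embed_dim, seq_len)
        # One feature per filter via max-over-time pooling, then concatenate.
        pooled = [conv(x).relu().max(dim=2).values for conv in self.convs]
        return self.fc(self.dropout(torch.cat(pooled, dim=1)))
```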
Show and Tell: Lessons learned from the 2015 MSCOCO Image Captioning Challenge
Automatically describing the content of an image is a fundamental problem in artificial intelligence that connects computer vision and natural language processing.
Text Summarization with Pretrained Encoders
For abstractive summarization, we propose a new fine-tuning schedule which adopts different optimizers for the encoder and the decoder as a means of alleviating the mismatch between the two (the former is pretrained while the latter is not).
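The two-optimizer schedule is straightforward to replicate with separate parameter groups: the pretrained encoder gets a smaller learning rate and longer warmup than the randomly initialised decoder, so fine-tuning does not destabilise the encoder while the decoder catches up. A sketch of the idea, with attribute names (`model.encoder`, `model.decoder`) and values that are illustrative rather than the authors' exact configuration:

```python
import torch

def build_optimizers(model, enc_lr=2e-3, dec_lr=0.1, enc_warmup=20000, dec_warmup=10000):
    """Separate Adam optimizers and warmup schedules for encoder and decoder (sketch)."""
    enc_opt = torch.optim.Adam(model.encoder.parameters(), lr=enc_lr)
    dec_opt = torch.optim.Adam(model.decoder.parameters(), lr=dec_lr)

    # Noam-style inverse-square-root schedule with a per-optimizer warmup.
    def schedule(warmup):
        return lambda step: min((step + 1) ** -0.5, (step + 1) * warmup ** -1.5)

    enc_sched = torch.optim.lr_scheduler.LambdaLR(enc_opt, schedule(enc_warmup))
    dec_sched = torch.optim.lr_scheduler.LambdaLR(dec_opt, schedule(dec_warmup))
    return (enc_opt, dec_opt), (enc_sched, dec_sched)
```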
Generating Sentences from a Continuous Space
The standard recurrent neural network language model (RNNLM) generates sentences one word at a time and does not work from an explicit global sentence representation.
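The paper's alternative is an RNN variational autoencoder whose decoder is conditioned on a single latent code for the whole sentence. A rough PyTorch sketch of that structure is below; sizes and names are illustrative, and the paper additionally relies on tricks such as KL annealing and word dropout to keep the latent code informative.

```python
import torch
import torch.nn as nn

class SentenceVAE(nn.Module):
    """Sketch of a sentence VAE: encoder LSTM -> Gaussian latent z -> decoder LSTM."""
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512, latent_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.to_mu = nn.Linear(hidden_dim, latent_dim)
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)
        self.z_to_hidden = nn.Linear(latent_dim, hidden_dim)
        self.decoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids):
        emb = self.embed(token_ids)
        _, (h, _) = self.encoder(emb)
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterisation
        h0 = torch.tanh(self.z_to_hidden(z)).unsqueeze(0)      # decoder initial state from z
        c0 = torch.zeros_like(h0)
        dec, _ = self.decoder(emb, (h0, c0))                   # teacher-forced reconstruction
        logits = self.out(dec)
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1).mean()
        return logits, kl
```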