Search Results for author: Clayton Greenberg

Found 9 papers, 1 paper with code

Long-Short Range Context Neural Networks for Language Modeling

No code implementations. EMNLP 2016. Youssef Oualil, Mittul Singh, Clayton Greenberg, Dietrich Klakow

The goal of language modeling techniques is to capture the statistical and structural properties of natural languages from training corpora.

Tasks: Language Modelling, Text Compression

Sequential Recurrent Neural Networks for Language Modeling

No code implementations. 23 Mar 2017. Youssef Oualil, Clayton Greenberg, Mittul Singh, Dietrich Klakow

Feedforward Neural Network (FNN)-based language models estimate the probability of the next word from the history of the last N words, whereas Recurrent Neural Network (RNN) models perform the same task using only the last word plus context information that cycles through the network.
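The contrast described above can be sketched in a minimal NumPy example (all weights, dimensions, and function names here are illustrative assumptions, not the paper's actual architecture): the FNN conditions on a fixed window of the last N embedded words, while the RNN folds the whole history into a recurrent hidden state one word at a time.

```python
import numpy as np

rng = np.random.default_rng(0)
V, D, H, N = 20, 8, 16, 3  # vocab size, embedding dim, hidden dim, FNN window

E = rng.normal(size=(V, D))          # shared word embeddings
W_fnn = rng.normal(size=(N * D, V))  # FNN: concatenated window -> logits
W_in = rng.normal(size=(D, H))       # RNN input weights
W_rec = rng.normal(size=(H, H))      # RNN recurrent weights
W_out = rng.normal(size=(H, V))      # RNN hidden state -> logits

def softmax(z):
    z = z - z.max()                  # numerically stable softmax
    e = np.exp(z)
    return e / e.sum()

def fnn_next_word_probs(history):
    """FNN LM: condition on exactly the last N words."""
    x = np.concatenate([E[w] for w in history[-N:]])
    return softmax(x @ W_fnn)

def rnn_next_word_probs(history):
    """RNN LM: condition on the last word plus a cycling hidden state."""
    h = np.zeros(H)
    for w in history:
        h = np.tanh(E[w] @ W_in + h @ W_rec)
    return softmax(h @ W_out)

history = [4, 7, 1, 9, 2]
p_fnn = fnn_next_word_probs(history)
p_rnn = rnn_next_word_probs(history)
```

Note that the FNN's prediction is unchanged by anything before the last N words, while the RNN's hidden state depends on the entire history.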

Tasks: Language Modelling, Text Compression

Sub-Word Similarity based Search for Embeddings: Inducing Rare-Word Embeddings for Word Similarity Tasks and Language Modelling

no code implementations COLING 2016 Mittul Singh, Clayton Greenberg, Youssef Oualil, Dietrich Klakow

We augmented pre-trained word embeddings with these novel embeddings and evaluated on a rare word similarity task, obtaining up to 3 times improvement in correlation over the original set of embeddings.

Tasks: Language Modelling, Morphological Analysis, +2
