Search Results for author: Amy Hemmeter

Found 3 papers, 3 papers with code

Multiple Word Embeddings for Increased Diversity of Representation

1 code implementation · 30 Sep 2020 · Brian Lester, Daniel Pressel, Amy Hemmeter, Sagnik Ray Choudhury, Srinivas Bangalore

Most state-of-the-art models in natural language processing (NLP) are neural models built on top of large, pre-trained, contextual language models that generate representations of words in context and are fine-tuned for the task at hand.

Word Embeddings
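The title's idea of drawing on multiple word embeddings can be illustrated with a minimal sketch: look a word up in several pre-trained embedding tables and concatenate the results. The tables, vocabularies, and vectors below are toy stand-ins, not the embeddings used in the paper.

```python
import numpy as np

def concat_embeddings(word, tables):
    """Look up `word` in each embedding table and concatenate the vectors.
    Unknown words fall back to a zero vector of that table's width."""
    parts = []
    for table, dim in tables:
        parts.append(table.get(word, np.zeros(dim)))
    return np.concatenate(parts)

# Two toy "pre-trained" tables with different dimensionalities,
# standing in for e.g. different embedding sources.
glove_like = {"cat": np.array([0.1, 0.2]), "dog": np.array([0.3, 0.4])}
w2v_like = {"cat": np.array([1.0, 0.0, 0.5]), "dog": np.array([0.2, 0.9, 0.1])}

tables = [(glove_like, 2), (w2v_like, 3)]
vec = concat_embeddings("cat", tables)
print(vec.shape)  # (5,) — the 2-dim and 3-dim vectors concatenated
```

Concatenation is only one way to combine tables; averaging or learned projections are common alternatives.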

Computationally Efficient NER Taggers with Combined Embeddings and Constrained Decoding

1 code implementation · 5 Jan 2020 · Brian Lester, Daniel Pressel, Amy Hemmeter, Sagnik Ray Choudhury

The CRF layer is used to facilitate global coherence between labels, and the contextual embeddings provide a better representation of words in context.

Named Entity Recognition +2
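The abstract's point about global coherence between labels can be sketched as constrained Viterbi decoding: a standard dynamic-programming pass over per-token label scores, with transitions that are illegal under the IOB scheme (e.g. `O → I-PER`) masked out. The label set, scores, and masking rule below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

LABELS = ["O", "B-PER", "I-PER"]
NEG_INF = -1e9

def build_transitions(labels):
    """Transition matrix where IOB-illegal moves score -inf."""
    n = len(labels)
    trans = np.zeros((n, n))
    for i, prev in enumerate(labels):
        for j, cur in enumerate(labels):
            # I-X may only follow B-X or I-X of the same entity type.
            if cur.startswith("I-"):
                ent = cur[2:]
                if prev not in (f"B-{ent}", f"I-{ent}"):
                    trans[i, j] = NEG_INF
    return trans

def viterbi(emissions, trans, labels):
    """Best label sequence for a (T, n_labels) score matrix."""
    T, n = emissions.shape
    score = emissions[0].copy()
    # I-X cannot start a sequence under IOB2.
    for j, lab in enumerate(labels):
        if lab.startswith("I-"):
            score[j] = NEG_INF
    back = np.zeros((T, n), dtype=int)
    for t in range(1, T):
        total = score[:, None] + trans + emissions[t][None, :]
        back[t] = total.argmax(axis=0)
        score = total.max(axis=0)
    best = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        best.append(int(back[t][best[-1]]))
    return [labels[i] for i in reversed(best)]

# Toy emission scores that naively favor the illegal sequence O, I-PER;
# the transition constraints steer decoding to a legal path instead.
emissions = np.array([[2.0, 0.0, 1.0],
                      [0.0, 0.5, 2.0]])
trans = build_transitions(LABELS)
print(viterbi(emissions, trans, LABELS))  # ['O', 'B-PER']
```

In a real CRF layer the transition scores are learned rather than hand-set, but hard masking of illegal transitions composes with learned scores in the same way.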
