Search Results for author: Konrad Sperfeld

Found 1 paper, 1 paper with code

Optimizing small BERTs trained for German NER

2 code implementations · 23 Apr 2021 · Jochen Zöllner, Konrad Sperfeld, Christoph Wick, Roger Labahn

Currently, the most widespread neural network architecture for training language models is the so-called BERT, which has led to improvements in various Natural Language Processing (NLP) tasks.

Named Entity Recognition +1
