Search Results for author: Wolfgang Krämer

Found 1 paper, 0 papers with code

Controlled Randomness Improves the Performance of Transformer Models

no code implementations • 20 Oct 2023 • Tobias Deußer, Cong Zhao, Wolfgang Krämer, David Leonhard, Christian Bauckhage, Rafet Sifa

During the pre-training step of natural language models, the main objective is to learn a general representation of the pre-training dataset; this usually requires large amounts of textual data to capture the complexity and diversity of natural language.
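For context, the sketch below illustrates what such a pre-training objective can look like in practice: a minimal masked-language-modeling loop in PyTorch. This is an illustration of generic pre-training only, not the controlled-randomness method of the paper above; all model sizes, names, and hyperparameters here are arbitrary assumptions for the example.

```python
# Minimal masked-language-modeling (MLM) pre-training sketch.
# NOT the paper's method; a generic illustration with made-up sizes.
import torch
import torch.nn as nn

VOCAB_SIZE, MASK_ID, SEQ_LEN, D_MODEL = 1000, 0, 16, 64  # assumed toy values

class TinyEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, D_MODEL)
        layer = nn.TransformerEncoderLayer(D_MODEL, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.lm_head = nn.Linear(D_MODEL, VOCAB_SIZE)

    def forward(self, ids):
        # Contextualize tokens, then predict a vocabulary distribution per slot.
        return self.lm_head(self.encoder(self.embed(ids)))

model = TinyEncoder()
opt = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss(ignore_index=-100)  # skip unmasked positions

tokens = torch.randint(1, VOCAB_SIZE, (8, SEQ_LEN))  # stand-in text batch
mask = torch.rand(tokens.shape) < 0.15               # mask ~15% of positions
inputs = tokens.masked_fill(mask, MASK_ID)           # corrupt the input
labels = tokens.masked_fill(~mask, -100)             # score masked slots only

logits = model(inputs)
loss = loss_fn(logits.view(-1, VOCAB_SIZE), labels.view(-1))
loss.backward()
opt.step()
print(f"MLM loss: {loss.item():.3f}")
```

On real corpora, the random batch above would be replaced by tokenized text, and the same reconstruction loss drives the model toward the general-purpose representation the abstract describes.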

Tasks: Named Entity Recognition +2
