Search Results for author: Anastasia Dietrich

Found 2 papers, 0 papers with code

Towards Structured Dynamic Sparse Pre-Training of BERT

no code implementations • 13 Aug 2021 • Anastasia Dietrich, Frithjof Gressmann, Douglas Orr, Ivan Chelombiev, Daniel Justus, Carlo Luschi

Identifying algorithms for computationally efficient unsupervised training of large language models is an important and active area of research.

Language Modelling
