Search Results for author: Lucas Georges Gabriel Charpentier

Found 3 papers, 1 paper with code

Not all layers are equally as important: Every Layer Counts BERT

no code implementations • 3 Nov 2023 • Lucas Georges Gabriel Charpentier, David Samuel

This paper introduces a novel modification of the transformer architecture, tailored for the data-efficient pretraining of language models.

Linguistic Acceptability · Natural Language Inference

BRENT: Bidirectional Retrieval Enhanced Norwegian Transformer

1 code implementation • 19 Apr 2023 • Lucas Georges Gabriel Charpentier, Sondre Wold, David Samuel, Egil Rønningstad

After training, we also separate the language model, which we call the reader, from the retriever components, and show that the reader can be fine-tuned on a range of downstream tasks.

Dependency Parsing · Extractive Question-Answering · +7
