no code implementations • 16 Apr 2024 • David Samuel, Lucas Georges Gabriel Charpentier, Sondre Wold
Retrieval-augmented language models offer a promising alternative to standard language modeling.
no code implementations • 3 Nov 2023 • Lucas Georges Gabriel Charpentier, David Samuel
This paper introduces a novel modification of the transformer architecture tailored for data-efficient pretraining of language models.
Ranked #6 on Linguistic Acceptability on CoLA
1 code implementation • 19 Apr 2023 • Lucas Georges Gabriel Charpentier, Sondre Wold, David Samuel, Egil Rønningstad
After training, we also separate the language model, which we call the reader, from the retriever components, and show that the reader can be fine-tuned on a range of downstream tasks.
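The separation described above can be illustrated with a minimal sketch. This is not the paper's actual code: the class and method names (`Retriever`, `Reader`, `detach_reader`, `fine_tune`) are hypothetical stand-ins, the retriever is a toy word-overlap scorer, and the reader is a placeholder for a real language model.

```python
class Retriever:
    """Toy retriever: returns stored passages ranked by word overlap
    with the query (a stand-in for a learned dense retriever)."""
    def __init__(self, passages):
        self.passages = passages

    def retrieve(self, query, k=1):
        q = set(query.lower().split())
        scored = sorted(self.passages,
                        key=lambda p: len(q & set(p.lower().split())),
                        reverse=True)
        return scored[:k]


class Reader:
    """Toy reader: prepends retrieved context to the query.
    A real reader would be a language model whose weights are
    updated during fine-tuning."""
    def __init__(self):
        self.task_head = None  # set by fine-tuning (hypothetical)

    def read(self, query, context=()):
        if not context:
            return query
        return " ".join(context) + " | " + query

    def fine_tune(self, task_name):
        # Stand-in for task-specific fine-tuning of the detached reader.
        self.task_head = task_name
        return self


class RetrievalAugmentedLM:
    """Joint model: retriever feeds context to the reader."""
    def __init__(self, passages):
        self.retriever = Retriever(passages)
        self.reader = Reader()

    def answer(self, query):
        context = self.retriever.retrieve(query)
        return self.reader.read(query, context)

    def detach_reader(self):
        # After training, the reader is separated from the retriever
        # and can be fine-tuned on downstream tasks on its own.
        return self.reader
```

Usage follows the abstract's two phases: run the joint model with retrieval, then detach the reader and fine-tune it alone, e.g. `model.detach_reader().fine_tune("cola")`.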