no code implementations • 26 Oct 2023 • Ahmed Alajrami, Katerina Margatina, Nikolaos Aletras
Understanding how and what pre-trained language models (PLMs) learn about language is an open challenge in natural language processing.
1 code implementation • ACL 2022 • Ahmed Alajrami, Nikolaos Aletras
Several pre-training objectives, such as masked language modeling (MLM), have been proposed for pre-training language models (e.g. BERT) with the aim of learning better language representations.
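The MLM objective mentioned above can be sketched as follows. This is a minimal illustration of BERT-style input corruption, not the paper's implementation: roughly 15% of positions are selected as prediction targets, and of those, 80% are replaced with a `[MASK]` token, 10% with a random token, and 10% are left unchanged. The constants `MASK_ID` and `VOCAB_SIZE` are illustrative (values taken from BERT's WordPiece vocabulary).

```python
import random

MASK_ID = 103       # [MASK] token id in BERT's vocabulary (illustrative)
VOCAB_SIZE = 30522  # BERT's WordPiece vocabulary size (illustrative)

def mask_tokens(token_ids, mask_prob=0.15, rng=None):
    """BERT-style MLM masking sketch.

    Selects ~mask_prob of positions as prediction targets; of those,
    80% become [MASK], 10% a random token, 10% stay unchanged.
    Returns (corrupted_ids, labels), where labels is -100 (a value
    typically ignored by the loss) at non-target positions.
    """
    rng = rng or random.Random()
    corrupted = list(token_ids)
    labels = [-100] * len(token_ids)
    for i, tok in enumerate(token_ids):
        if rng.random() < mask_prob:
            labels[i] = tok  # model must predict the original token here
            r = rng.random()
            if r < 0.8:
                corrupted[i] = MASK_ID            # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted[i] = rng.randrange(VOCAB_SIZE)  # 10%: random token
            # else 10%: keep the original token
    return corrupted, labels
```

During pre-training, the model receives `corrupted` as input and is trained to recover the original tokens at the positions where `labels` is not -100.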