no code implementations • 11 Dec 2023 • Soniya Vijayakumar, Tanja Bäumel, Simon Ostermann, Josef van Genabith
Pre-trained Language Models (PLMs) have proven consistently successful across a plethora of NLP tasks due to their ability to learn contextualized word representations (Ethayarajh, 2019).
no code implementations • 14 Nov 2023 • Tanja Bäumel, Soniya Vijayakumar, Josef van Genabith, Günter Neumann, Simon Ostermann
Pretrained language models (PLMs) form the basis of most state-of-the-art NLP technologies.
no code implementations • 22 Jan 2023 • Soniya Vijayakumar
The field of natural language processing has seen breakthroughs with the advent of transformer architectures.