no code implementations • LREC 2022 • Cedric Lothritz, Bertrand Lebichot, Kevin Allix, Lisa Veiber, Tegawende Bissyande, Jacques Klein, Andrey Boytsov, Clément Lefebvre, Anne Goujon
Pre-trained Language Models such as BERT have become ubiquitous in NLP, achieving state-of-the-art performance on most tasks.
no code implementations • 2 Mar 2023 • Yewei Song, Saad Ezzini, Jacques Klein, Tegawende Bissyande, Clément Lefebvre, Anne Goujon
We also make use of high-resource languages that are related to, or share the same linguistic root as, the target LRL.
1 code implementation • 11 Oct 2022 • Clément Lefebvre, Niklas Stoehr
In this work, we propose PR-ENT, a new event coding approach that is more flexible and resource-efficient, while maintaining competitive accuracy: first, we extend an event description such as "Military injured two civilians" by a template, e.g. "People were [Z]", and prompt a pre-trained (cloze) language model to fill the slot Z.
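The prompting step described above can be sketched in a few lines. This is an illustrative sketch only, not the authors' implementation: the helper name `build_cloze_prompt` and the example template are assumptions, and the commented-out Hugging Face `fill-mask` pipeline call shows one way a pre-trained cloze model could fill the slot.

```python
# Sketch of PR-ENT's template-and-prompt step (hypothetical helper name;
# the paper's actual templates and models may differ).

def build_cloze_prompt(event_description: str, template: str) -> str:
    """Append a cloze template to an event description, replacing the
    slot marker [Z] with BERT's [MASK] token."""
    return f"{event_description} {template.replace('[Z]', '[MASK]')}"

prompt = build_cloze_prompt("Military injured two civilians.", "People were [Z].")
print(prompt)  # Military injured two civilians. People were [MASK].

# With Hugging Face transformers, a pre-trained masked language model can
# then propose fillers for the slot (uncomment to run; downloads a model):
# from transformers import pipeline
# fill = pipeline("fill-mask", model="bert-base-uncased")
# candidates = [c["token_str"] for c in fill(prompt)]
```

The slot fillers returned by the cloze model can then be mapped to event codes, which is what makes the approach flexible: changing the coding scheme only requires changing templates, not retraining.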