1 code implementation • 13 Apr 2021 • Andres Garcia-Silva, Cristian Berrio, Jose Manuel Gomez-Perez
In this paper we shed light on the impact of fine-tuning on social media data on the internal representations of neural language models.
no code implementations • EACL 2021 • Georg Rehm, Stelios Piperidis, Kalina Bontcheva, Jan Hajic, Victoria Arranz, Andrejs Vasiļjevs, Gerhard Backfried, Jose Manuel Gomez-Perez, Ulrich Germann, Rémi Calizzano, Nils Feldhus, Stefanie Hegele, Florian Kintzel, Katrin Marheinecke, Julian Moreno-Schneider, Dimitris Galanis, Penny Labropoulou, Miltos Deligiannis, Katerina Gkirtzou, Athanasia Kolovou, Dimitris Gkoumas, Leon Voukoutis, Ian Roberts, Jana Hamrlova, Dusan Varis, Lukas Kacena, Khalid Choukri, Valérie Mapelli, Mickaël Rigault, Julija Melnika, Miro Janosik, Katja Prinz, Andres Garcia-Silva, Cristian Berrio, Ondrej Klejch, Steve Renals
Europe is a multilingual society, in which dozens of languages are spoken.
1 code implementation • WS 2019 • Andres Garcia-Silva, Cristian Berrio, José Manuel Gómez-Pérez
Fine-tuning pre-trained language models has significantly advanced the state of the art in a wide range of NLP downstream tasks.