Search Results for author: Evangelia Gogoulou

Found 8 papers, 1 paper with code

Continual Learning Under Language Shift

no code implementations 2 Nov 2023 Evangelia Gogoulou, Timothée Lesort, Magnus Boman, Joakim Nivre

The recent increase in data and model scale for language model pre-training has led to huge training costs.

Continual Learning · Language Modelling

The Nordic Pile: A 1.2TB Nordic Dataset for Language Modeling

no code implementations 30 Mar 2023 Joey Öhman, Severine Verlinden, Ariel Ekgren, Amaru Cuba Gyllensten, Tim Isbister, Evangelia Gogoulou, Fredrik Carlsson, Magnus Sahlgren

Pre-training Large Language Models (LLMs) requires massive amounts of text data, and the performance of the resulting models typically correlates with the scale and quality of the datasets.

Language Modelling

Cross-lingual Transfer of Monolingual Models

no code implementations LREC 2022 Evangelia Gogoulou, Ariel Ekgren, Tim Isbister, Magnus Sahlgren

Additionally, evaluating the transferred models on source-language tasks reveals that their performance in the source domain deteriorates after transfer.

Cross-Lingual Transfer · Domain Adaptation

Predicting Treatment Outcome from Patient Texts: The Case of Internet-Based Cognitive Behavioural Therapy

no code implementations EACL 2021 Evangelia Gogoulou, Magnus Boman, Fehmi ben Abdesslem, Nils Hentati Isacsson, Viktor Kaldo, Magnus Sahlgren

We investigate the feasibility of applying standard text categorisation methods to patient text in order to predict treatment outcome in Internet-based cognitive behavioural therapy.

Sentiment Analysis

Deep Representational Re-tuning using Contrastive Tension

1 code implementation ICLR 2021 Fredrik Carlsson, Amaru Cuba Gyllensten, Evangelia Gogoulou, Erik Ylipää Hellqvist, Magnus Sahlgren

Extracting semantically useful natural language sentence representations from pre-trained deep neural networks such as Transformers remains a challenge.

Semantic Similarity · Semantic Textual Similarity +3

SenseCluster at SemEval-2020 Task 1: Unsupervised Lexical Semantic Change Detection

no code implementations SEMEVAL 2020 Amaru Cuba Gyllensten, Evangelia Gogoulou, Ariel Ekgren, Magnus Sahlgren

We (Team Skurt) propose a simple method to detect lexical semantic change by clustering contextualized embeddings produced by XLM-R, using K-Means++.

Change Detection · Clustering +1
