1 code implementation • 15 Jun 2022 • Ali Davody, David Ifeoluwa Adelani, Thomas Kleinbauer, Dietrich Klakow
Transferring knowledge from one domain to another is of practical importance for many tasks in natural language processing, especially when the amount of available data in the target domain is limited.
1 code implementation • NAACL (SocialNLP) 2022 • Dana Ruiter, Thomas Kleinbauer, Cristina España-Bonet, Josef van Genabith, Dietrich Klakow
Recent research on style transfer takes inspiration from unsupervised neural machine translation (UNMT), learning from large amounts of non-parallel data by exploiting cycle consistency loss, back-translation, and denoising autoencoders.
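As a minimal sketch of how these three signals combine in a single training step (assuming a hypothetical shared encoder-decoder exposing `model.translate(x, src_style, tgt_style)` and `model.loss(inputs, targets, src_style, tgt_style)`; only the token-noise function is concrete, not the authors' implementation):

```python
import random

def add_noise(tokens, drop_prob=0.1, max_shuffle_dist=3):
    """Word-level corruption used by denoising autoencoders in UNMT:
    randomly drop tokens, then locally shuffle the survivors so no token
    moves more than `max_shuffle_dist` positions."""
    kept = [t for t in tokens if random.random() > drop_prob]
    keys = [i + random.uniform(0, max_shuffle_dist) for i in range(len(kept))]
    return [t for _, t in sorted(zip(keys, kept))]

def training_step(model, batch_a, batch_b):
    """One update combining the three losses named above; `model` is a
    hypothetical stand-in for a shared encoder-decoder over styles A and B."""
    # 1) Denoising autoencoding: reconstruct each style from a corrupted copy.
    l_dae = (model.loss([add_noise(x) for x in batch_a], batch_a, "A", "A")
             + model.loss([add_noise(x) for x in batch_b], batch_b, "B", "B"))
    # 2) Back-translation: generate synthetic pairs with the current model...
    synth_b = [model.translate(x, "A", "B") for x in batch_a]
    synth_a = [model.translate(x, "B", "A") for x in batch_b]
    # ...and 3) train the reverse directions on them (cycle consistency).
    l_bt = (model.loss(synth_b, batch_a, "B", "A")
            + model.loss(synth_a, batch_b, "A", "B"))
    return l_dae + l_bt
```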
1 code implementation • LREC 2022 • Dana Ruiter, Liane Reiners, Ashwin Geet D'Sa, Thomas Kleinbauer, Dominique Fohr, Irina Illina, Dietrich Klakow, Christian Schemer, Angeliki Monnier
Although online hate speech (HS) has been an important object of research over the last decade, most HS-related corpora oversimplify the phenomenon by attempting to label user comments as either "hate" or "neutral".
1 code implementation • EMNLP 2021 • David Ifeoluwa Adelani, Miaoran Zhang, Xiaoyu Shen, Ali Davody, Thomas Kleinbauer, Dietrich Klakow
Documents as short as a single sentence may inadvertently reveal sensitive information about their authors, including, e.g., their gender or ethnicity.
1 code implementation • ACL (WOAH) 2021 • Vanessa Hahn, Dana Ruiter, Thomas Kleinbauer, Dietrich Klakow
We observe that, on both similar and distant target tasks and across all languages, the subspace-based representations transfer more effectively than standard BERT representations in the zero-shot setting, with improvements between +10.9 and +42.9 F1 over the baselines across all tested monolingual and cross-lingual scenarios.
1 code implementation • 7 Aug 2020 • David Ifeoluwa Adelani, Ali Davody, Thomas Kleinbauer, Dietrich Klakow
Machine Learning approaches to Natural Language Processing tasks benefit from a comprehensive collection of real-life user data.
1 code implementation • 19 Jun 2020 • Ali Davody, David Ifeoluwa Adelani, Thomas Kleinbauer, Dietrich Klakow
Differentially private stochastic gradient descent (DPSGD) is a variant of stochastic gradient descent based on the Differential Privacy (DP) paradigm, which mitigates privacy threats arising from the presence of sensitive information in training data.
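As a minimal sketch of the standard DP-SGD update (per-example gradient clipping followed by Gaussian noise calibrated to the clipping bound), not tied to this paper's implementation; the per-example gradients are assumed to be given as numpy arrays:

```python
import numpy as np

def dpsgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0, noise_mult=1.1):
    """One DP-SGD update: clip each per-example gradient to `clip_norm`,
    average the clipped gradients, then add Gaussian noise whose standard
    deviation scales with the clipping bound (`noise_mult` is DP-SGD's sigma)."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    grad = np.mean(clipped, axis=0)
    # Noise std sigma*C on the summed gradient equals sigma*C/B on the mean.
    noise = np.random.normal(0.0, noise_mult * clip_norm / len(per_example_grads),
                             size=grad.shape)
    return params - lr * (grad + noise)

# Toy usage: five per-example gradients for a 3-parameter model.
params = np.zeros(3)
grads = [np.random.randn(3) for _ in range(5)]
params = dpsgd_step(params, grads)
```

Larger `noise_mult` gives stronger privacy guarantees at the cost of utility, which is the trade-off such work typically studies.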