Search Results for author: Mateusz Klimaszewski

Found 5 papers, 0 papers with code

No Train but Gain: Language Arithmetic for training-free Language Adapters enhancement

no code implementations • 24 Apr 2024 • Mateusz Klimaszewski, Piotr Andruszkiewicz, Alexandra Birch

Modular deep learning is the state-of-the-art solution for lifting the curse of multilinguality, preventing the impact of negative interference and enabling cross-lingual performance in Multilingual Pre-trained Language Models.

Transfer Learning
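The "language arithmetic" idea in the entry above — enhancing language adapters by doing arithmetic directly on their weights, with no further training — can be sketched as a parameter-wise interpolation. The adapter layout (a bottleneck down/up projection) and the coefficient `lam` below are illustrative assumptions, not the paper's exact recipe:

```python
import numpy as np

def language_arithmetic(adapter_a, adapter_b, lam=0.5):
    """Combine two language adapters parameter-wise: (1 - lam) * A + lam * B.

    adapter_a / adapter_b: dicts mapping parameter names to weight arrays
    of matching shapes. Training-free: the result is simply a new adapter
    state dict, usable for inference as-is.
    """
    assert adapter_a.keys() == adapter_b.keys()
    return {name: (1.0 - lam) * adapter_a[name] + lam * adapter_b[name]
            for name in adapter_a}

# Toy adapters with matching shapes (hypothetical bottleneck layout).
rng = np.random.default_rng(0)
src = {"down.weight": rng.normal(size=(8, 4)), "up.weight": rng.normal(size=(4, 8))}
tgt = {"down.weight": rng.normal(size=(8, 4)), "up.weight": rng.normal(size=(4, 8))}

merged = language_arithmetic(src, tgt, lam=0.3)
```

Because the operation is a closed-form combination of existing weights, no gradient steps are needed, which is what makes the enhancement "training-free".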

Is Modularity Transferable? A Case Study through the Lens of Knowledge Distillation

no code implementations • 27 Mar 2024 • Mateusz Klimaszewski, Piotr Andruszkiewicz, Alexandra Birch

Moreover, we propose a method that allows the transfer of modules between incompatible PLMs without any change in the inference complexity.

Domain Adaptation • Knowledge Distillation • +6
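The knowledge-distillation lens used in the case study above rests on the standard softened-softmax distillation loss (Hinton et al.): the student matches the teacher's temperature-scaled output distribution. The logits and temperature below are illustrative, not taken from the paper:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    Scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    """
    p = softmax(teacher_logits / T)   # soft teacher targets
    q = softmax(student_logits / T)   # student predictions
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)

teacher = np.array([[4.0, 1.0, 0.5]])
student = np.array([[3.5, 1.2, 0.4]])
loss = kd_loss(student, teacher, T=2.0)
```

A higher temperature flattens the teacher distribution, exposing the "dark knowledge" in the relative probabilities of wrong classes; the loss is zero exactly when student and teacher logits induce the same softened distribution.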

COMBO: State-of-the-Art Morphosyntactic Analysis

no code implementations EMNLP (ACL) 2021 Mateusz Klimaszewski, Alina Wróblewska

We introduce COMBO - a fully neural NLP system for accurate part-of-speech tagging, morphological analysis, lemmatisation, and (enhanced) dependency parsing.

Dependency Parsing • Morphological Analysis • +1

COMBO: a new module for EUD parsing

no code implementations ACL (IWPT) 2021 Mateusz Klimaszewski, Alina Wróblewska

We introduce the COMBO-based approach for EUD parsing and its implementation, which took part in the IWPT 2021 EUD shared task.

WUT at SemEval-2019 Task 9: Domain-Adversarial Neural Networks for Domain Adaptation in Suggestion Mining

no code implementations SEMEVAL 2019 Mateusz Klimaszewski, Piotr Andruszkiewicz

We present a system for cross-domain suggestion mining, prepared for the SemEval-2019 Task 9: Suggestion Mining from Online Reviews and Forums (Subtask B).

Domain Adaptation • General Classification • +4
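The domain-adversarial networks named in the entry above are typically built around a gradient reversal layer (Ganin & Lempitsky): identity in the forward pass, sign-flipped gradient in the backward pass, so the feature extractor learns domain-invariant features by confusing a domain classifier. A minimal sketch of just that layer, with an illustrative scaling factor `lam` (the values here are toy inputs, not the system's configuration):

```python
import numpy as np

def grad_reverse_forward(x):
    """Forward pass: the identity — features pass through unchanged."""
    return x

def grad_reverse_backward(grad_output, lam=1.0):
    """Backward pass: flip the gradient's sign, scaled by lam.

    Upstream layers therefore receive a gradient that *maximises* the
    domain classifier's loss, pushing features toward domain invariance.
    """
    return -lam * grad_output

features = np.array([0.5, -1.0, 2.0])
out = grad_reverse_forward(features)                    # identical to features
g = grad_reverse_backward(np.ones_like(out), lam=0.1)   # reversed, scaled gradient
```

In a full system this layer sits between the shared encoder and the domain classifier head, while the task head (here, suggestion classification) receives the unreversed gradient.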
