Search Results for author: Lovre Torbarina

Found 2 papers, 0 papers with code

Speeding Up Transformer Training By Using Dataset Subsampling - An Exploratory Analysis

no code implementations • EMNLP (sustainlp) 2021 • Lovre Torbarina, Velimir Mihelčić, Bruno Šarlija, Lukasz Roguski, Željko Kraljević

Transformer-based models have greatly advanced the field of natural language processing, and while they achieve state-of-the-art results on a wide range of tasks, they are cumbersome in parameter size.

Text Classification
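
The idea named in the title, training on a randomly drawn subset of the data to cut training time, can be illustrated with a minimal sketch. This is not the authors' experimental setup; it assumes the Hugging Face `datasets` library, and the toy corpus and the `subsample` helper are hypothetical stand-ins.

```python
# Minimal sketch of dataset subsampling before fine-tuning; an illustration
# of the general idea, not the paper's method.
from datasets import Dataset

# Hypothetical toy corpus standing in for a real text-classification dataset.
full_train = Dataset.from_dict({
    "text": [f"example sentence {i}" for i in range(10_000)],
    "label": [i % 2 for i in range(10_000)],
})

def subsample(dataset: Dataset, fraction: float, seed: int = 42) -> Dataset:
    """Return a random subset containing `fraction` of the examples."""
    n = int(len(dataset) * fraction)
    return dataset.shuffle(seed=seed).select(range(n))

# Training on e.g. 10% of the data roughly scales down wall-clock training
# time; the exploratory question is how much accuracy this costs.
small_train = subsample(full_train, fraction=0.1)
print(len(full_train), "->", len(small_train))
```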

Challenges and Opportunities of Using Transformer-Based Multi-Task Learning in NLP Through ML Lifecycle: A Survey

no code implementations • 16 Aug 2023 • Lovre Torbarina, Tin Ferkovic, Lukasz Roguski, Velimir Mihelcic, Bruno Sarlija, Zeljko Kraljevic

The increasing adoption of natural language processing (NLP) models across industries has created a need among practitioners for machine learning systems that handle these models efficiently, from training through serving them in production.

Continual Learning • Multi-Task Learning
