1 code implementation • 23 Aug 2023 • Richa Upadhyay, Ronald Phlypo, Rajkumar Saini, Marcus Liwicki
In this work, we introduce channel-wise l1/l2 group sparsity in the parameters (or weights) of the shared convolutional layers of the multi-task learning model.
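A channel-wise l1/l2 group-sparsity penalty is the group lasso: an l1 sum over the per-channel l2 norms of the shared layer's weights, which drives entire channels to zero. The sketch below is a minimal illustration of that penalty, assuming a standard conv weight layout of (out_channels, in_channels, kH, kW) with one group per output channel; the function name and the exact grouping used in the paper are assumptions, not the authors' implementation.

```python
import numpy as np

def channel_group_sparsity(weights, lam=1e-3):
    """Channel-wise l1/l2 (group-lasso) penalty on a conv weight tensor.

    weights: array of shape (out_channels, in_channels, kH, kW); each
    output channel forms one group (illustrative choice, not necessarily
    the paper's). The penalty is lam * sum_c ||W_c||_2: an l1 norm over
    per-channel l2 norms, which pushes whole channels to exactly zero.
    """
    out_channels = weights.shape[0]
    per_channel = weights.reshape(out_channels, -1)        # flatten each group
    group_norms = np.sqrt((per_channel ** 2).sum(axis=1))  # l2 norm per channel
    return lam * group_norms.sum()                         # l1 across groups
```

In training, this term would be added to the sum of the task losses; channels whose group norm shrinks to zero can then be pruned from the shared backbone.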
1 code implementation • 13 Oct 2022 • Richa Upadhyay, Prakash Chandra Chhipa, Ronald Phlypo, Rajkumar Saini, Marcus Liwicki
In particular, it focuses on the simultaneous learning of multiple tasks, an element of MTL, and on promptly adapting to new tasks, a quality of meta-learning.
Ranked #93 on Semantic Segmentation on NYU Depth v2
no code implementations • 23 Nov 2021 • Richa Upadhyay, Ronald Phlypo, Rajkumar Saini, Marcus Liwicki
Integrating knowledge across different domains is an essential feature of human learning.
no code implementations • 14 Mar 2014 • Rémi Flamary, Nisrine Jrad, Ronald Phlypo, Marco Congedo, Alain Rakotomamonjy
This framework is extended to the multi-task learning situation where several similar classification tasks related to different subjects are learned simultaneously.
no code implementations • 29 Mar 2013 • Matthew Anderson, Geng-Shen Fu, Ronald Phlypo, Tülay Adalı
Thus, we provide the additional conditions under which the arbitrary ordering of the sources within each dataset is common.