1 code implementation • 23 Aug 2023 • Richa Upadhyay, Ronald Phlypo, Rajkumar Saini, Marcus Liwicki
In this work, we introduce channel-wise l1/l2 group sparsity in the parameters (or weights) of the shared convolutional layers of the multi-task learning model.
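A channel-wise l1/l2 (group-lasso) penalty takes the l2 norm within each output-channel group of a convolutional weight tensor and sums those norms, so entire channels are driven toward zero. A minimal NumPy sketch of such a penalty (the function name and shapes are illustrative, not the paper's code; in training, a multiple of this penalty would be added to the task losses):

```python
import numpy as np

def group_sparsity_penalty(conv_weight):
    """Channel-wise l1/l2 (group-lasso) penalty for a conv weight tensor
    of shape (out_channels, in_channels, kH, kW): l2 norm within each
    output-channel group, then an l1 sum across groups, which encourages
    whole channels to become zero."""
    groups = conv_weight.reshape(conv_weight.shape[0], -1)  # one row per output channel
    return float(np.sum(np.linalg.norm(groups, axis=1)))    # l1 sum of per-channel l2 norms

# toy check: 2 channels of four ones -> each group norm is 2, penalty is 4
penalty = group_sparsity_penalty(np.ones((2, 1, 2, 2)))
```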
1 code implementation • 12 Mar 2023 • Prakash Chandra Chhipa, Muskaan Chopra, Gopal Mengi, Varun Gupta, Richa Upadhyay, Meenakshi Subhash Chippa, Kanjar De, Rajkumar Saini, Seiichi Uchida, Marcus Liwicki
This work investigates the unexplored usability of self-supervised representation learning in the direction of functional knowledge transfer.
1 code implementation • 18 Oct 2022 • Prakash Chandra Chhipa, Richa Upadhyay, Rajkumar Saini, Lars Lindqvist, Richard Nordenskjold, Seiichi Uchida, Marcus Liwicki
This work presents a novel self-supervised representation learning method to learn efficient representations without labels on images from a 3DPM sensor (3-Dimensional Particle Measurement, which estimates the particle size distribution of material), utilizing RGB images and depth maps of mining material on the conveyor belt.
1 code implementation • 13 Oct 2022 • Richa Upadhyay, Prakash Chandra Chhipa, Ronald Phlypo, Rajkumar Saini, Marcus Liwicki
In particular, it focuses on the simultaneous learning of multiple tasks, an element of MTL, and on promptly adapting to new tasks, a quality of meta-learning.
Ranked #93 on Semantic Segmentation on NYU Depth v2
no code implementations • 5 May 2022 • Foteini Simistira Liwicki, Richa Upadhyay, Prakash Chandra Chhipa, Killian Murphy, Federico Visi, Stefan Östersjö, Marcus Liwicki
While this idea was proposed in a previous study, this paper introduces several novelties: (i) it presents a novel method to overcome the class-imbalance challenge and make learning possible for co-existent gestures through a batch-balancing approach and spatial-temporal representations of gestures.
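One common way to realize batch balancing is to sample each class with replacement so every batch contains equal class counts, preventing minority gesture classes from being swamped. A minimal sketch under that assumption (function name and data layout are hypothetical, not the authors' exact procedure):

```python
import random
from collections import defaultdict

def balanced_batches(samples, labels, batch_size, seed=0):
    """Yield batches with equal counts per class by sampling each class
    with replacement, so minority classes appear as often as majority
    ones. A sketch of batch balancing, not the paper's exact method."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for s, y in zip(samples, labels):
        by_class[y].append(s)
    classes = sorted(by_class)
    per_class = batch_size // len(classes)
    while True:
        batch = []
        for c in classes:
            batch.extend(rng.choices(by_class[c], k=per_class))  # oversample minorities
        rng.shuffle(batch)
        yield batch

# imbalanced toy data: samples 0-7 are class 0, samples 8-9 are class 1
batch = next(balanced_batches(list(range(10)), [0] * 8 + [1] * 2, batch_size=4))
```

With a batch size of 4 and two classes, every batch holds two samples of each class regardless of the 8:2 imbalance in the data.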
1 code implementation • 15 Mar 2022 • Prakash Chandra Chhipa, Richa Upadhyay, Gustav Grund Pihlgren, Rajkumar Saini, Seiichi Uchida, Marcus Liwicki
This work presents a novel self-supervised pre-training method to learn efficient representations without labels on histopathology medical images utilizing magnification factors.
Ranked #1 on Breast Cancer Histology Image Classification on BreakHis (Accuracy (Inter-Patient) metric)
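Pre-training that utilizes magnification factors can treat views of the same tissue region at different magnifications as positive pairs for a contrastive objective. A minimal sketch of such pairing (the `regions` mapping and function are assumptions used only to illustrate the idea; a real pipeline would feed the pairs into a contrastive loss):

```python
def magnification_pairs(regions):
    """Build positive pairs from the same tissue region imaged at
    different magnification factors: adjacent magnifications of one
    region form a pair. `regions` maps region_id -> {magnification:
    image}; this layout is hypothetical, sketching the pairing idea."""
    pairs = []
    for views in regions.values():
        mags = sorted(views)                 # e.g. [40, 100, 200, 400]
        for lo, hi in zip(mags, mags[1:]):   # pair neighbouring magnifications
            pairs.append((views[lo], views[hi]))
    return pairs

# toy usage with string stand-ins for image patches
pairs = magnification_pairs({"r1": {40: "r1@40x", 100: "r1@100x", 200: "r1@200x"}})
```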
no code implementations • 23 Nov 2021 • Richa Upadhyay, Ronald Phlypo, Rajkumar Saini, Marcus Liwicki
Integrating knowledge across different domains is an essential feature of human learning.