Search Results for author: Linus Ericsson

Found 11 papers, 3 papers with code

Hyperparameter Selection in Continual Learning

no code implementations · 9 Apr 2024 · Thomas L. Lee, Sigrid Passano Hellan, Linus Ericsson, Elliot J. Crowley, Amos Storkey

In continual learning (CL) -- where a learner trains on a stream of data -- standard hyperparameter optimisation (HPO) cannot be applied, as a learner does not have access to all of the data at the same time.
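The streaming constraint can be illustrated with a toy sketch (hypothetical, not from the paper): a continual learner consumes each task's data exactly once, so hyperparameter search loops that retrain from scratch on the full dataset for every candidate setting have nothing left to train on after the first pass.

```python
# Hypothetical illustration of the one-pass constraint in continual learning:
# data arrives as a stream of tasks that cannot be rewound, which rules out
# standard HPO loops that revisit the whole dataset per hyperparameter trial.

def data_stream(tasks):
    """Yield each task's data exactly once; a CL learner cannot rewind."""
    for task_id, data in enumerate(tasks):
        yield task_id, data

tasks = [[1, 2], [3, 4], [5, 6]]
stream = data_stream(tasks)

seen = [data for _, data in stream]  # first (and only) pass over the stream
assert seen == tasks

# A second pass, e.g. for another hyperparameter trial, finds it exhausted.
assert list(stream) == []
```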

Continual Learning

PlainMamba: Improving Non-Hierarchical Mamba in Visual Recognition

1 code implementation · 26 Mar 2024 · Chenhongyi Yang, Zehui Chen, Miguel Espinosa, Linus Ericsson, Zhenyu Wang, Jiaming Liu, Elliot J. Crowley

In this paper, we further adapt the selective scanning process of Mamba to the visual domain, enhancing its ability to learn features from two-dimensional images by (i) a continuous 2D scanning process that improves spatial continuity by ensuring adjacency of tokens in the scanning sequence, and (ii) direction-aware updating which enables the model to discern the spatial relations of tokens by encoding directional information.
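The continuity property in (i) can be sketched as a snake-like traversal (a simplified illustration, not the paper's full scanning scheme): reversing every other row keeps consecutive tokens spatially adjacent, unlike a plain raster scan, which jumps from the end of one row to the start of the next.

```python
# Hypothetical sketch of a continuous 2D scan over an H x W token grid:
# alternate rows are traversed in opposite directions, so every consecutive
# pair of tokens in the sequence is exactly one grid step apart.

def snake_scan(height, width):
    order = []
    for r in range(height):
        cols = range(width) if r % 2 == 0 else range(width - 1, -1, -1)
        order.extend((r, c) for c in cols)
    return order

order = snake_scan(3, 4)

# Every consecutive pair of tokens is adjacent (Manhattan distance 1).
assert all(abs(r1 - r2) + abs(c1 - c2) == 1
           for (r1, c1), (r2, c2) in zip(order, order[1:]))
```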

Image Classification · Instance Segmentation +3

Better Practices for Domain Adaptation

no code implementations · 7 Sep 2023 · Linus Ericsson, Da Li, Timothy M. Hospedales

However, the domain shift scenario raises a second, more subtle challenge: the difficulty of performing hyperparameter optimisation (HPO) for these adaptation algorithms without access to a labelled validation set.

Benchmarking · Source-Free Domain Adaptation +2

Label-Efficient Object Detection via Region Proposal Network Pre-Training

no code implementations · 16 Nov 2022 · Nanqing Dong, Linus Ericsson, Yongxin Yang, Ales Leonardis, Steven McDonagh

In this work, we propose a simple pretext task that provides an effective pre-training for the RPN, towards efficiently improving downstream object detection performance.

Instance Segmentation · Object +4

Why Do Self-Supervised Models Transfer? Investigating the Impact of Invariance on Downstream Tasks

1 code implementation · 22 Nov 2021 · Linus Ericsson, Henry Gouk, Timothy M. Hospedales

We show that learned invariances strongly affect downstream task performance and confirm that different downstream tasks benefit from polar opposite (in)variances, leading to performance loss when the standard augmentation strategy is used.

Data Augmentation · Representation Learning +1

Self-Supervised Representation Learning: Introduction, Advances and Challenges

no code implementations · 18 Oct 2021 · Linus Ericsson, Henry Gouk, Chen Change Loy, Timothy M. Hospedales

Self-supervised representation learning methods aim to provide powerful deep feature learning without the requirement of large annotated datasets, thus alleviating the annotation bottleneck that is one of the main barriers to practical deployment of deep learning today.

Representation Learning

How Well Do Self-Supervised Models Transfer?

1 code implementation · CVPR 2021 · Linus Ericsson, Henry Gouk, Timothy M. Hospedales

We evaluate the transfer performance of 13 top self-supervised models on 40 downstream tasks, including many-shot and few-shot recognition, object detection, and dense prediction.

Classifier Calibration · Few-Shot Learning +7

Don't Wait, Just Weight: Improving Unsupervised Representations by Learning Goal-Driven Instance Weights

no code implementations · 22 Jun 2020 · Linus Ericsson, Henry Gouk, Timothy M. Hospedales

We show that by learning Bayesian instance weights for the unlabelled data, we can improve the downstream classification accuracy by prioritising the most useful instances.
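The general idea of instance weighting can be sketched as follows (a minimal, hypothetical illustration; the paper's Bayesian treatment of the weights is not reproduced here): each unlabelled example receives a weight, and training minimises the weighted average of per-example losses, so high-weight instances dominate the update.

```python
import numpy as np

# Hypothetical sketch of instance-weighted training: normalised per-example
# weights (parameterised by learnable logits) rescale per-example losses, so
# examples judged more useful contribute more to the training objective.

rng = np.random.default_rng(0)
losses = rng.uniform(0.1, 2.0, size=8)   # per-instance losses
logits = rng.normal(size=8)              # learnable weight parameters

weights = np.exp(logits) / np.exp(logits).sum()  # softmax-normalised weights
weighted_loss = float(np.dot(weights, losses))

# The weighted loss is a convex combination of the per-instance losses.
assert 0.0 < weighted_loss < float(losses.max())
```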

Meta-Learning · Self-Supervised Learning
