Search Results for author: Wonho Bae

Found 10 papers, 2 papers with code

Rethinking Class Activation Mapping for Weakly Supervised Object Localization

1 code implementation · ECCV 2020 · Wonho Bae, Junhyug Noh, Gunhee Kim

Weakly supervised object localization (WSOL) is a task of localizing an object in an image only using image-level labels.

Object · Weakly-Supervised Object Localization
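For context, the baseline CAM that this paper rethinks is a weighted sum of the last convolutional feature maps, using the classifier weights of the target class (Zhou et al., 2016). A minimal NumPy sketch; the function name and array shapes are illustrative, not from the paper:

```python
import numpy as np

def class_activation_map(features, fc_weights, class_idx):
    """Baseline CAM: weight the final conv feature maps by the
    linear classifier's weights for the target class.

    features:   (C, H, W) feature maps from the last conv layer
    fc_weights: (num_classes, C) weights of the final linear classifier
    """
    cam = np.tensordot(fc_weights[class_idx], features, axes=1)  # (H, W)
    cam = np.maximum(cam, 0.0)        # keep positive evidence only
    if cam.max() > 0:
        cam = cam / cam.max()         # normalize to [0, 1]
    return cam
```

Thresholding the resulting map gives the localization box, which is how WSOL methods produce boxes from image-level labels alone.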

AdaFlood: Adaptive Flood Regularization

no code implementations · 6 Nov 2023 · Wonho Bae, Yi Ren, Mohamed Osama Ahmed, Frederick Tung, Danica J. Sutherland, Gabriel L. Oliveira

Although neural networks are conventionally optimized towards zero training loss, it has recently been shown that targeting a non-zero training loss threshold, referred to as a flood level, often yields better test-time generalization.
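The flood-level objective the abstract refers to (from Ishida et al.'s original flooding regularizer, which AdaFlood builds on) fits in one line; the adaptive level that gives AdaFlood its name is not shown here:

```python
def flooded_loss(loss, flood_level):
    # Flooding (Ishida et al., 2020): instead of driving the training
    # loss to zero, keep it near a target level b. When loss < b the
    # gradient direction flips (gradient ascent), so the model hovers
    # around b rather than memorizing the training set. AdaFlood (per
    # its title) adapts this level rather than fixing one constant.
    return abs(loss - flood_level) + flood_level
```

Above the flood level the objective is unchanged; below it, minimizing `flooded_loss` pushes the loss back up toward `flood_level`.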

Exploring Active Learning in Meta-Learning: Enhancing Context Set Labeling

no code implementations · 6 Nov 2023 · Wonho Bae, Jing Wang, Danica J. Sutherland

Most meta-learning methods assume that the (very small) context set used to establish a new task at test time is passively provided.

Active Learning · Meta-Learning

How to prepare your task head for finetuning

no code implementations · 11 Feb 2023 · Yi Ren, Shangmin Guo, Wonho Bae, Danica J. Sutherland

We identify a significant trend in the effect of changes in this initial energy on the resulting features after fine-tuning.

Meta Temporal Point Processes

no code implementations · 27 Jan 2023 · Wonho Bae, Mohamed Osama Ahmed, Frederick Tung, Gabriel L. Oliveira

In this work, we propose to train TPPs in a meta learning framework, where each sequence is treated as a different task, via a novel framing of TPPs as neural processes (NPs).

Meta-Learning · Point Processes
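As background, TPP training maximizes the sequence log-likelihood, which has a closed form in the simplest constant-intensity case; a sketch (illustrative only — neural TPPs, and the neural-process framing here, learn a history-dependent intensity instead of a constant rate):

```python
import math

def poisson_tpp_loglik(event_times, rate, horizon):
    """Log-likelihood of event times on [0, horizon] under a
    constant-intensity (homogeneous Poisson) temporal point process:

        sum_i log(rate)  -  integral of rate over [0, horizon]
      = n * log(rate)    -  rate * horizon
    """
    return len(event_times) * math.log(rate) - rate * horizon
```

Treating each event sequence as its own task, as the abstract describes, means this likelihood is evaluated per sequence with sequence-specific (meta-learned) parameters.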

Object Discovery via Contrastive Learning for Weakly Supervised Object Detection

1 code implementation · 16 Aug 2022 · Jinhwan Seo, Wonho Bae, Danica J. Sutherland, Junhyug Noh, Daijin Kim

Weakly Supervised Object Detection (WSOD) is a task that detects objects in an image using a model trained only on image-level annotations.

Contrastive Learning · Object · +2

Making Look-Ahead Active Learning Strategies Feasible with Neural Tangent Kernels

no code implementations · 25 Jun 2022 · Mohamad Amin Mohamadi, Wonho Bae, Danica J. Sutherland

We propose a new method for approximating active learning acquisition strategies that are based on retraining with hypothetically-labeled candidate data points.

Active Learning
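A naive version of the retraining-based ("look-ahead") acquisition that this method approximates can be sketched with a closed-form least-squares learner. This is a generic illustration of the expensive baseline, not the paper's NTK approximation; all names and the scoring rule are assumptions:

```python
import numpy as np

def lookahead_scores(X_train, y_train, X_pool, X_val, y_val, label_candidates):
    """Look-ahead active learning, brute force: for each pool point,
    hypothetically label it, retrain, and score by the best-case drop
    in validation error. The cost is one retraining per (point, label)
    pair, which is what makes NTK-based approximation attractive."""
    def fit_predict(X, y, Xq):
        w, *_ = np.linalg.lstsq(X, y, rcond=None)  # closed-form "retraining"
        return Xq @ w

    base_err = np.mean((fit_predict(X_train, y_train, X_val) - y_val) ** 2)
    scores = []
    for x in X_pool:
        errs = []
        for y_hyp in label_candidates:
            X_aug = np.vstack([X_train, x])
            y_aug = np.append(y_train, y_hyp)
            err = np.mean((fit_predict(X_aug, y_aug, X_val) - y_val) ** 2)
            errs.append(err)
        scores.append(base_err - min(errs))  # larger = more helpful point
    return scores
```

With a neural network in place of `fit_predict`, each inner call is a full training run, which is exactly the infeasibility the title refers to.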

A Fast, Well-Founded Approximation to the Empirical Neural Tangent Kernel

no code implementations · 25 Jun 2022 · Mohamad Amin Mohamadi, Wonho Bae, Danica J. Sutherland

Empirical neural tangent kernels (eNTKs) can provide a good understanding of a given network's representation: they are often far less expensive to compute, and more broadly applicable, than infinite-width NTKs.
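The eNTK itself is just the Gram matrix of parameter Jacobians at the current weights, Θ(x₁, x₂) = ∇f(x₁) · ∇f(x₂). A finite-difference sketch for a scalar-output model, to pin down the definition (illustrative helper, not the paper's fast approximation):

```python
import numpy as np

def empirical_ntk(f, params, x1, x2, eps=1e-5):
    """Empirical NTK entry for a scalar-output model f(params, x):
    the inner product of parameter gradients at the current params,
    computed here by central finite differences."""
    def grad(x):
        g = np.zeros_like(params)
        for i in range(len(params)):
            p_hi = params.copy(); p_hi[i] += eps
            p_lo = params.copy(); p_lo[i] -= eps
            g[i] = (f(p_hi, x) - f(p_lo, x)) / (2 * eps)
        return g
    return grad(x1) @ grad(x2)
```

For a linear model f(p, x) = p·x the gradient is x itself, so the eNTK reduces to the plain inner product x₁·x₂, a handy sanity check.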

One Weird Trick to Improve Your Semi-Weakly Supervised Semantic Segmentation Model

no code implementations · 2 May 2022 · Wonho Bae, Junhyug Noh, Milad Jalali Asadabadi, Danica J. Sutherland

Semi-weakly supervised semantic segmentation (SWSSS) aims to train a model to identify objects in images based on a small number of images with pixel-level labels, and many more images with only image-level labels.

Pseudo Label · Segmentation · +2
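A generic pseudo-labeling step of the kind SWSSS pipelines use can be sketched as follows. This is a hedged illustration, not the paper's method; the background convention (class 0) and threshold are assumptions:

```python
import numpy as np

def pseudo_label(score_map, present_classes, threshold=0.5):
    """Turn image-level labels into per-pixel pseudo labels:
    restrict per-pixel predictions to the classes known to be present
    from the image-level label, then keep only confident pixels.

    score_map:       (num_classes, H, W) per-class scores
    present_classes: class indices from the image-level label
    """
    masked = np.full_like(score_map, -np.inf)
    masked[present_classes] = score_map[present_classes]
    labels = masked.argmax(axis=0)                 # best present class
    confident = masked.max(axis=0) >= threshold    # confidence gate
    return np.where(confident, labels, 0)          # 0 = background here
```

The resulting dense pseudo labels then supervise the segmentation model alongside the few images that have real pixel-level annotations.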
