Search Results for author: Shinnosuke Matsuo

Found 8 papers, 5 papers with code

Counting Network for Learning from Majority Label

1 code implementation • 20 Mar 2024 • Kaito Shiku, Shinnosuke Matsuo, Daiki Suehiro, Ryoma Bise

Existing multiple instance learning (MIL) methods are unsuitable for learning from majority label (LML) because they aggregate instance confidences, which can make the bag-level label inconsistent with the label obtained by counting the instances of each class.

Multiple Instance Learning
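
The inconsistency described above can be seen in a toy example: confidence aggregation and instance counting can pick different bag labels for the same bag. A minimal NumPy sketch (not the paper's counting network; the pooling rules here are generic stand-ins):

```python
import numpy as np

def bag_label_by_confidence(instance_probs):
    # Typical MIL-style pooling: average instance confidences, then take the argmax.
    return int(np.argmax(instance_probs.mean(axis=0)))

def bag_label_by_counting(instance_probs):
    # LML definition: count the instances assigned to each class, take the majority class.
    preds = instance_probs.argmax(axis=1)
    counts = np.bincount(preds, minlength=instance_probs.shape[1])
    return int(np.argmax(counts))

# Toy 3-instance, 2-class bag: two instances weakly favor class 0, one strongly favors class 1.
probs = np.array([[0.55, 0.45],
                  [0.55, 0.45],
                  [0.10, 0.90]])
print(bag_label_by_confidence(probs))  # 1 -- confidence aggregation picks class 1
print(bag_label_by_counting(probs))    # 0 -- counting picks the majority class 0
```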

Boosting for Bounding the Worst-class Error

no code implementations • 20 Oct 2023 • Yuya Saito, Shinnosuke Matsuo, Seiichi Uchida, Daiki Suehiro

This paper tackles the problem of minimizing the worst-class error rate, rather than the standard error rate averaged over all classes.

Image Classification • Medical Image Classification
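
As a quick illustration of the objective, the worst-class error is the maximum of the per-class error rates rather than their mean. A minimal sketch of the two quantities:

```python
import numpy as np

def per_class_errors(y_true, y_pred, num_classes):
    # Error rate computed separately for each ground-truth class.
    return np.array([np.mean(y_pred[y_true == c] != c) for c in range(num_classes)])

y_true = np.array([0, 0, 0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 0, 0, 0, 1, 0, 2, 0])

errs = per_class_errors(y_true, y_pred, num_classes=3)
print(errs)         # [0.  0.5 0.5] per-class error rates
print(errs.mean())  # standard error averaged over classes (~0.33)
print(errs.max())   # worst-class error (0.5), the quantity bounded here
```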

Deep Attentive Time Warping

1 code implementation • 13 Sep 2023 • Shinnosuke Matsuo, Xiaomeng Wu, Gantugs Atarsaikhan, Akisato Kimura, Kunio Kashino, Brian Kenji Iwana, Seiichi Uchida

Unlike other learnable models using DTW for warping, our model predicts all local correspondences between two time series and is trained based on metric learning, which enables it to learn the optimal data-dependent warping for the target task.

Dynamic Time Warping • Metric Learning • +2
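
A rough PyTorch sketch of the idea of predicting soft local correspondences between two series and using them to weight pairwise distances; the scoring network, pooling, and training loss below are placeholders, not the paper's architecture:

```python
import torch
import torch.nn as nn

class SoftWarpingDistance(nn.Module):
    def __init__(self, dim, hidden=32):
        super().__init__()
        # Scores every (i, j) pair of time steps; a softmax over j turns the
        # scores into soft local correspondences.
        self.scorer = nn.Sequential(nn.Linear(2 * dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x, y):
        # x: (Tx, dim), y: (Ty, dim)
        Tx, Ty = x.shape[0], y.shape[0]
        pairs = torch.cat([x[:, None, :].expand(Tx, Ty, -1),
                           y[None, :, :].expand(Tx, Ty, -1)], dim=-1)
        attn = torch.softmax(self.scorer(pairs).squeeze(-1), dim=1)  # (Tx, Ty) soft correspondences
        dist = torch.cdist(x, y)                                     # pairwise step distances
        return (attn * dist).sum() / Tx                              # correspondence-weighted distance

model = SoftWarpingDistance(dim=1)
x, y = torch.randn(50, 1), torch.randn(60, 1)
d = model(x, y)   # during training, d would feed a contrastive / metric-learning loss
d.backward()
```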

MixBag: Bag-Level Data Augmentation for Learning from Label Proportions

no code implementations • ICCV 2023 • Takanori Asanomi, Shinnosuke Matsuo, Daiki Suehiro, Ryoma Bise

In this paper, we propose MixBag, a bag-level data augmentation method for LLP, based on a key observation from our preliminary experiments: instance-level classification accuracy improves as the number of labeled bags increases, even when the total number of instances is fixed.

Data Augmentation • Weakly-supervised Learning
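
One natural reading of bag-level mixing is to sample instances from two labeled bags and interpolate their label proportions to form a new labeled bag; the sketch below follows that reading (the sampling ratio and procedure are assumptions, not necessarily the paper's exact scheme):

```python
import numpy as np

def mix_bags(bag_a, prop_a, bag_b, prop_b, gamma, rng):
    # Take a gamma fraction of instances from bag A and the rest from bag B.
    n_a = int(round(gamma * len(bag_a)))
    n_b = len(bag_a) - n_a
    idx_a = rng.choice(len(bag_a), n_a, replace=False)
    idx_b = rng.choice(len(bag_b), n_b, replace=False)
    mixed_bag = np.concatenate([bag_a[idx_a], bag_b[idx_b]])
    # The new bag's label proportion is the matching convex combination.
    mixed_prop = (n_a * prop_a + n_b * prop_b) / (n_a + n_b)
    return mixed_bag, mixed_prop

rng = np.random.default_rng(0)
bag_a, prop_a = rng.normal(size=(10, 4)), np.array([0.7, 0.3])
bag_b, prop_b = rng.normal(size=(10, 4)), np.array([0.2, 0.8])
new_bag, new_prop = mix_bags(bag_a, prop_a, bag_b, prop_b, gamma=0.5, rng=rng)
print(new_bag.shape, new_prop)  # (10, 4) [0.45 0.55]
```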

Learning from Label Proportion with Online Pseudo-Label Decision by Regret Minimization

1 code implementation • 17 Feb 2023 • Shinnosuke Matsuo, Ryoma Bise, Seiichi Uchida, Daiki Suehiro

This paper proposes a novel and efficient method for Learning from Label Proportions (LLP), whose goal is to train a classifier using only the class label proportions of instance sets, called bags.

Pseudo Label
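
In this setting only bag-level proportions are observed; a simple baseline for turning them into instance-level pseudo-labels is to assign classes greedily by confidence while respecting each bag's proportions. The sketch below shows that baseline, not the paper's regret-minimization decision rule:

```python
import numpy as np

def pseudo_labels_from_proportions(instance_probs, proportions):
    n, k = instance_probs.shape
    quota = np.round(proportions * n).astype(int)  # instances to assign per class
    labels = np.full(n, -1)
    # Greedily take the most confident (instance, class) pairs while quotas remain.
    order = np.dstack(np.unravel_index(np.argsort(-instance_probs, axis=None), (n, k)))[0]
    for i, c in order:
        if labels[i] == -1 and quota[c] > 0:
            labels[i] = c
            quota[c] -= 1
    return labels

probs = np.array([[0.9, 0.1], [0.8, 0.2], [0.4, 0.6], [0.3, 0.7]])
print(pseudo_labels_from_proportions(probs, np.array([0.5, 0.5])))  # [0 0 1 1]
```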

Dynamic Data Augmentation with Gating Networks for Time Series Recognition

1 code implementation • 5 Nov 2021 • Daisuke Oba, Shinnosuke Matsuo, Brian Kenji Iwana

We propose a neural network that dynamically selects the best combination of data augmentation methods using a mutually beneficial gating network and a feature consistency loss.

Data Augmentation • Time Series • +1
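
A minimal PyTorch sketch of the gating idea, assuming the gate outputs softmax weights over a fixed set of time-series augmentations and the weighted combination is passed on to the recognizer; the augmentations, architecture, and the feature consistency loss are placeholders:

```python
import torch
import torch.nn as nn

def jitter(x):   return x + 0.03 * torch.randn_like(x)
def scale(x):    return x * (1.0 + 0.1 * torch.randn(x.shape[0], 1, 1))
def identity(x): return x

AUGMENTATIONS = [identity, jitter, scale]

class GatedAugmentation(nn.Module):
    def __init__(self, length):
        super().__init__()
        # Gating network: maps the raw series to one weight per augmentation.
        self.gate = nn.Sequential(nn.Flatten(), nn.Linear(length, len(AUGMENTATIONS)))

    def forward(self, x):                                            # x: (batch, 1, length)
        weights = torch.softmax(self.gate(x), dim=1)                 # (batch, n_augs)
        augmented = torch.stack([a(x) for a in AUGMENTATIONS], dim=1)  # (batch, n_augs, 1, length)
        return (weights[:, :, None, None] * augmented).sum(dim=1)    # weighted combination

x = torch.randn(8, 1, 100)
out = GatedAugmentation(length=100)(x)
print(out.shape)  # torch.Size([8, 1, 100])
```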

Self-Augmented Multi-Modal Feature Embedding

no code implementations • 8 Mar 2021 • Shinnosuke Matsuo, Seiichi Uchida, Brian Kenji Iwana

To exploit this fact, we propose the use of self-augmentation and combine it with multi-modal feature embedding.
