Task-Conditioned Stochastic Subsampling

29 Sep 2021 · Andreis Bruno, Seanie Lee, A. Tuan Nguyen, Juho Lee, Eunho Yang, Sung Ju Hwang

Deep Learning algorithms are designed to operate on huge volumes of high-dimensional data such as images. To reduce the volume of data these algorithms must process, we propose a set-based, two-stage, end-to-end neural subsampling model that is jointly optimized with an arbitrary downstream task network such as a classifier. In the first stage, we efficiently subsample candidate elements using conditionally independent Bernoulli random variables; in the second stage, we perform conditionally dependent autoregressive subsampling of the candidate elements using Categorical random variables. We apply our method to feature and instance selection and show that it outperforms the relevant baselines at very low subsampling rates on many tasks, including image classification, image reconstruction, function reconstruction, and few-shot classification. Additionally, for nonparametric models such as Neural Processes that must leverage the whole training set at inference time, we show that our method improves scalability. To ensure easy reproducibility, we provide source code in the Supplementary Material.
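The two-stage procedure described above can be sketched roughly as follows. This is an illustrative sketch, not the authors' implementation: the module names (`TwoStageSubsampler`, `candidate_scorer`, `select_scorer`), the fixed subset size `k`, and the use of relaxed Bernoulli and Gumbel-softmax samples to keep the selection differentiable are assumptions of the sketch, not details taken from the paper.

```python
# Minimal sketch of a two-stage stochastic subsampler (not the paper's code).
import torch
import torch.nn as nn


class TwoStageSubsampler(nn.Module):
    def __init__(self, dim, k):
        super().__init__()
        self.k = k  # number of elements kept by the second stage (assumed fixed here)
        # Stage 1: per-element logits for conditionally independent Bernoulli gates.
        self.candidate_scorer = nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, 1)
        )
        # Stage 2: scores each element conditioned on a summary of already-selected ones.
        self.select_scorer = nn.Linear(2 * dim, 1)

    def forward(self, x):
        # x: (batch, n, dim) set of elements (features or instances).
        b, n, d = x.shape

        # ---- Stage 1: conditionally independent Bernoulli candidate selection ----
        logits = self.candidate_scorer(x).squeeze(-1)  # (b, n)
        # Relaxed Bernoulli sample so gradients can flow to the scorer.
        gate = torch.distributions.RelaxedBernoulli(
            temperature=torch.tensor(0.5), logits=logits
        ).rsample()
        candidates = x * gate.unsqueeze(-1)  # soft-masked candidate elements

        # ---- Stage 2: autoregressive Categorical selection of k elements ----
        summary = torch.zeros(b, d, device=x.device)  # summary of picks so far
        mask = torch.zeros(b, n, device=x.device)     # prevents re-picking an element
        picked = []
        for _ in range(self.k):
            ctx = summary.unsqueeze(1).expand(-1, n, -1)
            s = self.select_scorer(torch.cat([candidates, ctx], dim=-1)).squeeze(-1)
            s = s.masked_fill(mask.bool(), float("-inf"))
            # Differentiable one-hot draw via straight-through Gumbel-softmax.
            one_hot = nn.functional.gumbel_softmax(s, tau=0.5, hard=True)  # (b, n)
            picked.append((one_hot.unsqueeze(-1) * candidates).sum(dim=1))
            mask = mask + one_hot
            summary = summary + picked[-1]
        return torch.stack(picked, dim=1)  # (b, k, dim)
```

In a full pipeline, the returned (batch, k, dim) subset would be fed to the downstream task network (e.g. a classifier), and both networks would be trained end to end with the task loss, for example via `TwoStageSubsampler(dim=32, k=10)(torch.randn(4, 100, 32))`.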
