Effective Version Space Reduction for Convolutional Neural Networks

22 Jun 2020 · Jiayu Liu, Ioannis Chiotellis, Rudolph Triebel, Daniel Cremers

In active learning, sampling bias can pose a serious inconsistency problem and prevent the algorithm from finding the optimal hypothesis. However, many active learning methods for neural networks are hypothesis-space agnostic and do not address this problem. We examine active learning with convolutional neural networks through the principled lens of version space reduction. We identify the connection between two approaches, prior mass reduction and diameter reduction, and propose a new diameter-based querying method: the minimum Gibbs-vote disagreement. By estimating the version space diameter and bias, we illustrate how the version space of neural networks evolves and examine the realizability assumption. In experiments on the MNIST, Fashion-MNIST, SVHN and STL-10 datasets, we demonstrate that diameter-reduction methods reduce the version space more effectively and perform better than prior mass reduction and other baselines, and that the Gibbs-vote disagreement is on par with the best query method.
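As a rough illustration of the diameter-based querying idea, the sketch below estimates the Gibbs-vote disagreement (the average disagreement between a randomly drawn hypothesis and the majority vote over all sampled hypotheses) and selects the pool point whose expected post-query disagreement is smallest. This is a minimal sketch under assumptions not stated in the abstract: hypotheses sampled from the version space are approximated by an ensemble of K trained networks (e.g., via different seeds or MC-dropout), disagreement is measured empirically on a reference set, and all names here (`gibbs_vote_disagreement`, `select_query`, `pool_preds`, `eval_preds`) are hypothetical, not taken from the paper.

```python
import numpy as np

def gibbs_vote_disagreement(preds):
    """Average disagreement between a random (Gibbs) hypothesis and the
    majority-vote classifier, over hypotheses and evaluation points.

    preds: (K, N) integer class predictions of K sampled hypotheses
           on N evaluation points.
    """
    # Majority vote per evaluation point.
    vote = np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, preds)
    # Fraction of (hypothesis, point) pairs that disagree with the vote.
    return np.mean(preds != vote[None, :])

def select_query(pool_preds, eval_preds):
    """Pick the pool index whose labeling minimizes the expected
    Gibbs-vote disagreement of the surviving hypotheses.

    pool_preds: (K, M) predictions of the K hypotheses on M pool points.
    eval_preds: (K, N) predictions on points used to measure disagreement.
    """
    K, M = pool_preds.shape
    best_idx, best_val = None, np.inf
    for i in range(M):
        labels, counts = np.unique(pool_preds[:, i], return_counts=True)
        expected = 0.0
        for y, c in zip(labels, counts):
            # Hypotheses consistent with label y for candidate point i.
            survivors = pool_preds[:, i] == y
            # Weight each outcome by its empirical prior mass c / K.
            expected += (c / K) * gibbs_vote_disagreement(eval_preds[survivors])
        if expected < best_val:
            best_idx, best_val = i, expected
    return best_idx
```

In practice, `pool_preds` and `eval_preds` would come from ensemble or MC-dropout forward passes over the unlabeled pool and a reference set; the returned index is the point to label before retraining.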


Results from the Paper


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|------|---------|-------|-------------|--------------|-------------|
| Image Classification | STL-10 | M2-PWD | Percentage correct | 57.31 | # 117 |
| Image Classification | STL-10 | GVD | Percentage correct | 59.33 | # 109 |
| Image Classification | STL-10 | PWD | Percentage correct | 59.45 | # 108 |
| Image Classification | STL-10 | DFAL | Percentage correct | 58.81 | # 113 |
| Image Classification | STL-10 | Core SET | Percentage correct | 58.93 | # 111 |
| Image Classification | STL-10 | BALD-MCD | Percentage correct | 57.35 | # 115 |
| Image Classification | STL-10 | GE | Percentage correct | 58.84 | # 112 |
| Image Classification | STL-10 | Random | Percentage correct | 58.15 | # 114 |
| Image Classification | STL-10 | VR | Percentage correct | 59.13 | # 110 |

Methods


No methods listed for this paper.