Regularization

DropPath

Introduced by Larsson et al. in FractalNet: Ultra-Deep Neural Networks without Residuals

Just as dropout prevents co-adaptation of activations, DropPath prevents co-adaptation of parallel paths in networks such as FractalNets by randomly dropping operands of the join layers. This discourages the network from using one input path as an anchor and another as a corrective term (a configuration that, if not prevented, is prone to overfitting). Two sampling strategies are:

  • Local: a join drops each input independently with a fixed probability, but at least one input is guaranteed to survive.
  • Global: a single path is selected for the entire network. We restrict this path to be a single column, thereby promoting individual columns as independently strong predictors.
Source: FractalNet: Ultra-Deep Neural Networks without Residuals
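The local strategy above can be sketched in a few lines of NumPy. This is an illustrative sketch, not the authors' code: the function name `local_drop_path_join` is an assumption, and the join is taken to be an element-wise mean of the surviving inputs, as FractalNet's join layers do.

```python
import numpy as np

def local_drop_path_join(inputs, drop_prob=0.15, rng=None):
    """Local DropPath sampling at a join layer (illustrative sketch).

    Each incoming path is dropped independently with probability
    `drop_prob`, but at least one path is always kept. The surviving
    paths are joined by an element-wise mean.
    """
    rng = rng or np.random.default_rng()
    keep = rng.random(len(inputs)) >= drop_prob
    if not keep.any():
        # Guarantee at least one survivor by keeping a random path.
        keep[rng.integers(len(inputs))] = True
    survivors = [x for x, k in zip(inputs, keep) if k]
    return np.mean(survivors, axis=0)

# Example: joining two parallel paths of the same shape.
a = np.ones((2, 3))
b = 3 * np.ones((2, 3))
out = local_drop_path_join([a, b], drop_prob=0.5)
```

Depending on which paths survive, `out` is `a`, `b`, or their mean; at inference time no paths are dropped, so the join reduces to a plain mean over all inputs. The global strategy differs only in that one column-spanning path is sampled once for the whole network rather than per join.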
