Skip Connection Blocks

ResNeXt Block

Introduced by Xie et al. in Aggregated Residual Transformations for Deep Neural Networks

A ResNeXt Block is a type of residual block used as part of the ResNeXt CNN architecture. It uses a "split-transform-merge" strategy (branched paths within a single module), similar to an Inception module, i.e., it aggregates a set of transformations. Compared to a Residual Block, it exposes a new dimension, cardinality (the size of the set of transformations) $C$, as an essential factor in addition to depth and width.

Formally, a set of aggregated transformations can be represented as: $\mathcal{F}(x)=\sum_{i=1}^{C}\mathcal{T}_i(x)$, where $\mathcal{T}_i(x)$ can be an arbitrary function. Analogous to a simple neuron, $\mathcal{T}_i$ should project $x$ into an (optionally low-dimensional) embedding and then transform it.
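The aggregation above can be sketched in plain numpy. This is an illustrative toy, not the paper's actual grouped-convolution implementation: each branch $\mathcal{T}_i$ here is a random down-projection to a low-dimensional embedding, a nonlinearity, and an up-projection, and the residual form $y = x + \mathcal{F}(x)$ is added on top. All weights and dimensions are placeholder assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_branch(d_in, d_bottleneck, d_out):
    """One transformation T_i: project x into a low-dimensional
    embedding, apply a nonlinearity, then transform back up.
    Weights are random placeholders for illustration only."""
    W_down = rng.standard_normal((d_in, d_bottleneck))
    W_up = rng.standard_normal((d_bottleneck, d_out))
    def T(x):
        return np.maximum(x @ W_down, 0.0) @ W_up  # ReLU in the embedding
    return T

def aggregated_transform(x, branches):
    """F(x) = sum_{i=1}^{C} T_i(x): the split-transform-merge step."""
    return sum(T(x) for T in branches)

def resnext_block(x, branches):
    """Residual form y = x + F(x); identity shortcut assumes d_out == d_in."""
    return x + aggregated_transform(x, branches)

C = 32                      # cardinality: number of parallel branches
d, d_bottleneck = 64, 4     # input width and per-branch embedding width
branches = [make_branch(d, d_bottleneck, d) for _ in range(C)]

x = rng.standard_normal((1, d))
y = resnext_block(x, branches)
```

Note that cardinality $C$ is a separate axis from depth and width: the branch count can grow while each branch stays narrow (`d_bottleneck` small), which is the trade-off the paper explores.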

