Structured Convolutions for Efficient Neural Network Design

In this work, we tackle model efficiency by exploiting redundancy in the implicit structure of the building blocks of convolutional neural networks. We start our analysis by introducing a general definition of Composite Kernel structures that enable the execution of convolution operations in the form of efficient, scaled, sum-pooling components...
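The abstract above is truncated, but the core idea it alludes to, executing a convolution as a sum-pooling step followed by a smaller convolution, can be illustrated with a minimal NumPy sketch. The construction below is an assumption for illustration: an N x N "composite" kernel K is built by superimposing c x c all-ones blocks, each scaled by an entry of a smaller n x n weight grid W, so that convolving with K equals stride-1 sum-pooling followed by convolving with W. All names (correlate2d_valid, sum_pool, W, K, c) are hypothetical helpers, not the paper's API.

```python
import numpy as np

def correlate2d_valid(x, k):
    """Plain 'valid' 2-D cross-correlation (no padding, stride 1)."""
    H, Wd = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, Wd - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def sum_pool(x, c):
    """Stride-1 c x c sum pooling == correlation with an all-ones kernel."""
    return correlate2d_valid(x, np.ones((c, c)))

rng = np.random.default_rng(0)
n, c = 3, 2          # small-kernel size and pooling-block size (assumed)
N = n + c - 1        # resulting composite-kernel size

# Build the composite kernel: each W[i, j] scales a c x c block at offset (i, j).
W = rng.standard_normal((n, n))
K = np.zeros((N, N))
for i in range(n):
    for j in range(n):
        K[i:i + c, j:j + c] += W[i, j]

x = rng.standard_normal((8, 8))
direct = correlate2d_valid(x, K)                  # ordinary convolution with K
factored = correlate2d_valid(sum_pool(x, c), W)   # sum-pool, then small conv
print(np.allclose(direct, factored))              # → True
```

The equivalence holds because each output of the composite convolution is a weighted sum of c x c patch sums, which is exactly what the pooled-then-small-conv path computes; the factored form replaces multiplications by K's N x N entries with cheap additions plus an n x n convolution.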
