Rotated MNIST
18 papers with code • 1 benchmark • 1 dataset
Most implemented papers
Domain Generalization using Causal Matching
In the domain generalization literature, a common objective is to learn representations independent of the domain after conditioning on the class label.
CyCNN: A Rotation Invariant CNN using Polar Mapping and Cylindrical Convolution Layers
Deep Convolutional Neural Networks (CNNs) are empirically known to be invariant to moderate translation but not to rotation in image classification.
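The polar trick behind CyCNN can be seen in a few lines of NumPy: remapping an image onto an (r, θ) grid turns a rotation about the centre into a cyclic shift along the angle axis, which translation-tolerant convolutions then handle. The helper below is an illustrative nearest-neighbour sketch, not the paper's implementation:

```python
import numpy as np

def to_polar(img, n_r=32, n_theta=64):
    """Resample a square image onto an (r, theta) grid centred on the
    image centre, using nearest-neighbour sampling."""
    h, w = img.shape
    cy, cx = (h - 1) / 2, (w - 1) / 2
    rs = np.linspace(0.0, min(cy, cx), n_r)
    ts = np.linspace(0.0, 2 * np.pi, n_theta, endpoint=False)
    r, t = np.meshgrid(rs, ts, indexing="ij")        # shape (n_r, n_theta)
    ys = np.clip(np.round(cy + r * np.sin(t)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + r * np.cos(t)).astype(int), 0, w - 1)
    return img[ys, xs]

rng = np.random.default_rng(0)
img = rng.random((65, 65))            # odd size, so there is an exact centre pixel
polar = to_polar(img)
polar_rot = to_polar(np.rot90(img))   # rotate the input by 90 degrees
# The rotation appears as a cyclic shift of a quarter turn along the theta axis:
err = np.abs(polar_rot - np.roll(polar, -64 // 4, axis=1)).mean()
```

For rotations that are not multiples of 90° the correspondence is approximate (interpolation error), but the principle is the same.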
Learning Partial Equivariances from Data
Frequently, transformations occurring in data can be better represented by a subset of a group than by the group as a whole, e.g., rotations in $[-90^{\circ}, 90^{\circ}]$.
Exploiting Redundancy: Separable Group Convolutional Networks on Lie Groups
In addition, the gain in computational efficiency allows us to implement G-CNNs equivariant to the $\mathrm{Sim(2)}$ group: the group of dilations, rotations, and translations.
Learning Invariant Representations for Equivariant Neural Networks Using Orthogonal Moments
The final classification layer in equivariant neural networks is invariant to affine geometric transformations such as rotation, reflection, and translation; the scalar value is obtained either by eliminating the spatial dimensions of the filter responses through convolution and down-sampling throughout the network, or by averaging over the filter responses.
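Why averaging over filter responses yields an invariant scalar is easiest to see in the simplest (translation) case: when the feature extractor is equivariant, a shift of the input only shifts the feature map, so a global average over the spatial axis is unchanged. A minimal 1-D NumPy sketch, not taken from the paper:

```python
import numpy as np

def circ_corr(x, k):
    """Circular cross-correlation: translation-equivariant by construction."""
    n = len(x)
    idx = (np.arange(len(k))[None, :] + np.arange(n)[:, None]) % n
    return x[idx] @ k

rng = np.random.default_rng(1)
x, k = rng.random(16), rng.random(5)

y = circ_corr(x, k)
y_shift = circ_corr(np.roll(x, 3), k)
# Equivariance: shifting the input shifts the feature map ...
assert np.allclose(y_shift, np.roll(y, 3))
# ... so averaging over the spatial axis gives a shift-invariant scalar:
assert np.isclose(y.mean(), y_shift.mean())
```

The same argument carries over to rotation- or reflection-equivariant feature maps, with the average taken over the corresponding group dimension.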
Learning unfolded networks with a cyclic group structure
Deep neural networks lack straightforward ways to incorporate domain knowledge and are notoriously considered black boxes.
Artificial Neuronal Ensembles with Learned Context Dependent Gating
Finally, there is a regularization term responsible for ensuring that new tasks are encoded in gates that are as orthogonal as possible from previously used ones.
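One common way to express such an orthogonality pressure is to penalise the squared cosine similarity between the new task's gate vector and each previously used gate. The function below is an illustrative sketch of that idea; the name and exact form are assumptions, not the paper's notation:

```python
import numpy as np

def gate_overlap_penalty(new_gate, old_gates):
    """Sum of squared cosine similarities between a new task's gate and
    the gates of previous tasks; zero iff the new gate is orthogonal to
    all of them (hypothetical regulariser, for illustration only)."""
    g = new_gate / np.linalg.norm(new_gate)
    return sum(float(np.dot(g, old / np.linalg.norm(old))) ** 2
               for old in old_gates)

old_gates = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
p_orth = gate_overlap_penalty(np.array([0.0, 0.0, 1.0]), old_gates)  # 0: orthogonal
p_over = gate_overlap_penalty(np.array([1.0, 1.0, 0.0]), old_gates)  # ~1: overlaps both
```

Adding such a term to the task loss pushes the learned gating patterns for different tasks toward orthogonal subspaces, reducing interference between them.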