Search Results for author: Matthias Rath

Found 4 papers, 0 papers with code

Deep Neural Networks with Efficient Guaranteed Invariances

no code implementations · 2 Mar 2023 · Matthias Rath, Alexandru Paul Condurache

We then address the problem of incorporating multiple desired invariances into a single network.

Improving the Sample-Complexity of Deep Classification Networks with Invariant Integration

no code implementations · 8 Feb 2022 · Matthias Rath, Alexandru Paul Condurache

We demonstrate the improved sample complexity on the Rotated-MNIST, SVHN and CIFAR-10 datasets where rotation-invariant-integration-based Wide-ResNet architectures using monomials and weighted sums outperform the respective baselines in the limited sample regime.

Rotated MNIST
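The invariant-integration idea mentioned in the abstract above can be illustrated with a minimal sketch: averaging a monomial of feature values over a finite rotation group makes the resulting scalar identical for any rotated copy of the input. The sketch below assumes the 4-fold rotation group acting on a 2D feature map via `np.rot90`; the function name and the choice of sample positions are illustrative, not the authors' implementation (which uses learned monomials inside a Wide-ResNet).

```python
import numpy as np

def invariant_integration(feature_map, exponents):
    """Group-average a monomial over the 4-fold rotation group C_4.

    For each of the four 90-degree rotations of `feature_map`, a
    monomial is formed from a fixed set of positions (here simply the
    first few flattened entries, raised to `exponents`) and the four
    monomial values are averaged. Because the average runs over the
    whole group, the result is unchanged when the input is rotated.
    """
    exps = np.array(exponents, dtype=float)
    total = 0.0
    for k in range(4):  # the four elements of C_4
        rotated = np.rot90(feature_map, k)
        vals = rotated.flatten()[: len(exps)]
        total += np.prod(vals ** exps)
    return total / 4.0
```

Since rotating the input merely permutes the group elements in the sum, `invariant_integration(f, e)` equals `invariant_integration(np.rot90(f), e)` exactly.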

Boosting Deep Neural Networks with Geometrical Prior Knowledge: A Survey

no code implementations · 30 Jun 2020 · Matthias Rath, Alexandru Paul Condurache

One promising approach, inspired by the success of convolutional neural networks in computer vision tasks, is to incorporate knowledge about the symmetric geometrical transformations of the problem at hand that affect the output in a predictable way.

3D Object Detection · Autonomous Driving +1

Invariant Integration in Deep Convolutional Feature Space

no code implementations · 20 Apr 2020 · Matthias Rath, Alexandru Paul Condurache

In this contribution, we show how to incorporate prior knowledge into a deep neural network architecture in a principled manner.

Rotated MNIST
