no code implementations • 4 Mar 2022 • Alvaro H. C. Correia, Daniel E. Worrall, Roberto Bondesan
Simulated annealing (SA) is a stochastic global optimisation technique applicable to a wide range of discrete and continuous variable problems.
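The abstract above describes simulated annealing in general terms; as a reminder of the basic mechanism (not the paper's learned variant), a minimal sketch with a geometric cooling schedule looks like this. The `cost`, `neighbour`, and schedule parameters are illustrative choices, not taken from the paper.

```python
import math
import random

def simulated_annealing(cost, neighbour, x0, t0=1.0, cooling=0.995, steps=10000):
    """Minimise `cost` from start point x0 with geometric cooling."""
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbour(x)
        fy = cost(y)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-(fy - fx) / t), which shrinks as t cools.
        if fy <= fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

random.seed(0)
best, fbest = simulated_annealing(
    cost=lambda x: (x - 3.0) ** 2,           # toy 1D objective, minimum at 3
    neighbour=lambda x: x + random.uniform(-0.5, 0.5),
    x0=10.0,
)
```

The acceptance rule is what makes SA a global optimiser: early on (high temperature) it explores uphill moves freely; as the temperature decays it becomes increasingly greedy.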
1 code implementation • 15 Feb 2022 • Johannes Brandstetter, Max Welling, Daniel E. Worrall
In this paper, we present a method that can partially alleviate this problem by improving neural PDE solver sample complexity: Lie point symmetry data augmentation (LPSDA).
2 code implementations • NeurIPS 2020 • Elise van der Pol, Daniel E. Worrall, Herke van Hoof, Frans A. Oliehoek, Max Welling
MDP homomorphic networks are neural networks that are equivariant under symmetries in the joint state-action space of an MDP.
5 code implementations • NeurIPS 2020 • Fabian B. Fuchs, Daniel E. Worrall, Volker Fischer, Max Welling
We introduce the SE(3)-Transformer, a variant of the self-attention module for 3D point clouds and graphs, which is equivariant under continuous 3D roto-translations.
no code implementations • 18 Nov 2019 • Nichita Diaconu, Daniel E. Worrall
We also modify the Squeeze-and-Excitation variant of attention, extending both variants to the roto-translation group.
1 code implementation • NeurIPS 2019 • Daniel E. Worrall, Max Welling
We introduce deep scale-spaces (DSS), a generalization of convolutional neural networks, exploiting the scale symmetry structure of conventional image recognition tasks.
no code implementations • 12 May 2019 • Nichita Diaconu, Daniel E. Worrall
In this paper, we learn how to transform filters for use in the group convolution, focussing on roto-translation.
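For context, the fixed (non-learned) baseline that this paper generalises is the standard group convolution, where filters are transformed by each group element before convolving. A minimal numpy sketch for the p4 rotation group (0, 90, 180, 270 degrees), using a hand-rolled valid cross-correlation so it is self-contained; the learned approach in the paper would replace the hard-coded `np.rot90` with a learned filter transformation:

```python
import numpy as np

def correlate2d_valid(image, filt):
    """Plain 'valid'-mode 2D cross-correlation (no filter flip)."""
    n, m = image.shape[0], filt.shape[0]
    out = np.empty((n - m + 1, n - m + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + m, j:j + m] * filt)
    return out

def p4_lifting_conv(image, filt):
    """Lift a 2D image to the p4 group: correlate with the filter
    rotated by each of the four 90-degree rotations."""
    return np.stack([
        correlate2d_valid(image, np.rot90(filt, k))
        for k in range(4)
    ])

rng = np.random.default_rng(0)
img = rng.normal(size=(6, 6))
filt = rng.normal(size=(3, 3))
out = p4_lifting_conv(img, filt)
```

The payoff is equivariance: rotating the input by 90 degrees rotates each output map and cyclically permutes the rotation channels, i.e. `p4_lifting_conv(np.rot90(img), filt)` equals `np.rot90(np.roll(out, 1, axis=0), axes=(1, 2))`.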
3 code implementations • CVPR 2019 • Tycho F. A. van der Ouderaa, Daniel E. Worrall
The Pix2pix and CycleGAN losses have vastly improved the qualitative and quantitative visual quality of results in image-to-image translation tasks.
2 code implementations • 20 Nov 2017 • Saki Shinoda, Daniel E. Worrall, Gabriel J. Brostow
Semi-supervised learning (SSL) partially circumvents the high cost of labeling data by augmenting a small labeled dataset with a large and relatively cheap unlabeled dataset drawn from the same distribution.
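To make the SSL setting concrete, here is a generic self-training sketch (fit on labelled data, pseudo-label the unlabelled pool, refit on the union), using a nearest-centroid classifier purely for illustration. This is not the method of the paper above; the classifier, loop, and round count are all placeholder choices:

```python
import numpy as np

def nearest_centroid_fit(X, y):
    """Fit per-class mean vectors."""
    classes = np.unique(y)
    return classes, np.stack([X[y == c].mean(axis=0) for c in classes])

def nearest_centroid_predict(model, X):
    """Assign each point to its nearest class centroid."""
    classes, centroids = model
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=-1)
    return classes[d.argmin(axis=1)]

def self_train(X_lab, y_lab, X_unlab, rounds=3):
    """Generic SSL self-training loop: repeatedly pseudo-label the
    unlabelled set and refit on labelled + pseudo-labelled data."""
    X, y = X_lab, y_lab
    for _ in range(rounds):
        model = nearest_centroid_fit(X, y)
        pseudo = nearest_centroid_predict(model, X_unlab)
        X = np.concatenate([X_lab, X_unlab])
        y = np.concatenate([y_lab, pseudo])
    return nearest_centroid_fit(X, y)

# Two well-separated Gaussian blobs; one labelled point per class.
rng = np.random.default_rng(1)
X0 = rng.normal([-2.0, 0.0], 0.5, size=(50, 2))
X1 = rng.normal([2.0, 0.0], 0.5, size=(50, 2))
X_lab = np.array([[-2.0, 0.0], [2.0, 0.0]])
y_lab = np.array([0, 1])
model = self_train(X_lab, y_lab, np.concatenate([X0, X1]))
preds = nearest_centroid_predict(model, np.concatenate([X0, X1]))
```

The unlabelled points sharpen the centroid estimates beyond what the two labelled points alone provide, which is the core promise of SSL.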
no code implementations • ICCV 2017 • Daniel E. Worrall, Stephan J. Garbin, Daniyar Turmukhambetov, Gabriel J. Brostow
We propose a simple method to construct a deep feature space, with explicitly disentangled representations of several known transformations.
no code implementations • 1 May 2017 • Ryutaro Tanno, Daniel E. Worrall, Aurobrata Ghosh, Enrico Kaden, Stamatios N. Sotiropoulos, Antonio Criminisi, Daniel C. Alexander
In this work, we investigate the value of uncertainty modeling in 3D super-resolution with convolutional neural networks (CNNs).
1 code implementation • CVPR 2017 • Daniel E. Worrall, Stephan J. Garbin, Daniyar Turmukhambetov, Gabriel J. Brostow
This is not the case for rotations.