1 code implementation • 27 Nov 2023 • Ameya Daigavane, Song Kim, Mario Geiger, Tess Smidt
We present Symphony, an $E(3)$-equivariant autoregressive generative model for 3D molecular geometries that iteratively builds a molecule from molecular fragments.
no code implementations • 4 Oct 2023 • Allan dos Santos Costa, Ilan Mitnikov, Mario Geiger, Manvitha Ponnapati, Tess Smidt, Joseph Jacobson
Three-dimensional native states of natural proteins display recurring and hierarchical patterns.
1 code implementation • 1 Mar 2023 • Ivan Diaz, Mario Geiger, Richard Iain McKinley
Convolutional neural networks (CNNs) allow for parameter sharing and translational equivariance by using convolutional kernels in their linear layers.
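The translational equivariance mentioned above can be checked directly: shifting the input of a convolution shifts its output. A minimal numpy sketch (not code from the paper; circular shifts are used for simplicity, so only the interior away from the boundary is compared):

```python
import numpy as np

def conv1d_valid(x, k):
    """Cross-correlate signal x with kernel k ('valid' mode); k is shared across positions."""
    n = len(x) - len(k) + 1
    return np.array([np.dot(x[i:i + len(k)], k) for i in range(n)])

x = np.array([0., 2., 1., 5., 3., 4.])
k = np.array([1., -1.])

out_then_shift = np.roll(conv1d_valid(x, k), 1)      # apply conv, then translate
shift_then_out = conv1d_valid(np.roll(x, 1), k)      # translate, then apply conv

# Up to boundary effects, the two orders agree: the layer is translation-equivariant.
print(np.allclose(out_then_shift[1:-1], shift_then_out[1:-1]))  # True
```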
no code implementations • 31 Jan 2023 • Antonio Sclocchi, Mario Geiger, Matthieu Wyart
They show that SGD noise can be detrimental or instead useful depending on the training regime.
4 code implementations • 18 Jul 2022 • Mario Geiger, Tess Smidt
We present e3nn, a generalized framework for creating E(3) equivariant trainable functions, also known as Euclidean neural networks.
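E(3) equivariance means a function of 3D geometry commutes with rotations and translations: $f(Rx + t) = Rf(x) + t$. A toy numpy check of this property (this is an illustration of the defining equation, not the e3nn API; the centroid is used as a trivially equivariant map):

```python
import numpy as np

def centroid(points):
    """A toy E(3)-equivariant map: the centroid commutes with rotations and translations."""
    return points.mean(axis=0)

theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],   # rotation about the z-axis
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([1.0, -2.0, 0.5])                        # translation

pts = np.array([[0.0, 0.0, 0.0],
                [1.0, 0.0, 0.0],
                [0.0, 1.0, 1.0]])

lhs = centroid(pts @ R.T + t)   # transform the input, then apply f
rhs = R @ centroid(pts) + t     # apply f, then transform the output
print(np.allclose(lhs, rhs))    # True: f(Rx + t) = R f(x) + t
```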
1 code implementation • 16 Jun 2021 • Mario Geiger, Christophe Eloy, Matthieu Wyart
Reinforcement learning is generally difficult for partially observable Markov decision processes (POMDPs), which arise when the agent's observations are partial or noisy.
no code implementations • NeurIPS 2021 • Oliver T. Unke, Mihail Bogojeski, Michael Gastegger, Mario Geiger, Tess Smidt, Klaus-Robert Müller
Machine learning has enabled the prediction of quantum chemical properties with high accuracy and efficiency, making it possible to bypass computationally costly ab initio calculations.
2 code implementations • NeurIPS 2021 • Leonardo Petrini, Alessandro Favero, Mario Geiger, Matthieu Wyart
Understanding why deep nets can classify data in large dimensions remains a challenge.
1 code implementation • 8 Jan 2021 • Simon Batzner, Albert Musaelian, Lixin Sun, Mario Geiger, Jonathan P. Mailoa, Mordechai Kornbluth, Nicola Molinari, Tess E. Smidt, Boris Kozinsky
This work presents Neural Equivariant Interatomic Potentials (NequIP), an E(3)-equivariant neural network approach for learning interatomic potentials from ab initio calculations for molecular dynamics simulations.
1 code implementation • 30 Dec 2020 • Mario Geiger, Leonardo Petrini, Matthieu Wyart
In this manuscript, we review recent results elucidating (i, ii) and the perspective they offer on the (still unexplained) curse of dimensionality paradox.
1 code implementation • 19 Aug 2020 • Benjamin Kurt Miller, Mario Geiger, Tess E. Smidt, Frank Noé
Equivariant neural networks (ENNs) are graph neural networks embedded in $\mathbb{R}^3$ and are well suited for predicting molecular properties.
1 code implementation • 22 Jul 2020 • Jonas Paccolat, Leonardo Petrini, Mario Geiger, Kevin Tyloo, Matthieu Wyart
We confirm these predictions both for a one-hidden layer FC network trained on the stripe model and for a 16-layers CNN trained on MNIST, for which we also find $\beta_\text{Feature}>\beta_\text{Lazy}$.
1 code implementation • 4 Jul 2020 • Tess E. Smidt, Mario Geiger, Benjamin Kurt Miller
Curie's principle states that "when effects show certain asymmetry, this asymmetry must be found in the causes that gave rise to them".
no code implementations • 19 Jun 2019 • Mario Geiger, Stefano Spigler, Arthur Jacot, Matthieu Wyart
Two distinct limits for deep learning have been derived as the network width $h\rightarrow \infty$, depending on how the weights of the last layer scale with $h$.
no code implementations • 26 May 2019 • Stefano Spigler, Mario Geiger, Matthieu Wyart
We extract $a$ from real data by performing kernel PCA, leading to $\beta\approx 0.36$ for MNIST and $\beta\approx 0.07$ for CIFAR10, in good agreement with observations.
1 code implementation • 6 Jan 2019 • Mario Geiger, Arthur Jacot, Stefano Spigler, Franck Gabriel, Levent Sagun, Stéphane d'Ascoli, Giulio Biroli, Clément Hongler, Matthieu Wyart
At this threshold, we argue that $\|f_{N}\|$ diverges.
no code implementations • NeurIPS 2019 • Taco Cohen, Mario Geiger, Maurice Weiler
Feature maps in these networks represent fields on a homogeneous base space, and layers are equivariant maps between spaces of fields.
no code implementations • 22 Oct 2018 • Stefano Spigler, Mario Geiger, Stéphane d'Ascoli, Levent Sagun, Giulio Biroli, Matthieu Wyart
We argue that in fully-connected networks a phase transition delimits the over- and under-parametrized regimes where fitting can or cannot be achieved.
2 code implementations • 25 Sep 2018 • Mario Geiger, Stefano Spigler, Stéphane d'Ascoli, Levent Sagun, Marco Baity-Jesi, Giulio Biroli, Matthieu Wyart
In the vicinity of this transition, properties of the curvature of the minima of the loss are critical.
1 code implementation • NeurIPS 2018 • Maurice Weiler, Mario Geiger, Max Welling, Wouter Boomsma, Taco Cohen
We prove that equivariant convolutions are the most general equivariant linear maps between fields over $\mathbb{R}^3$.
no code implementations • ICML 2018 • Marco Baity-Jesi, Levent Sagun, Mario Geiger, Stefano Spigler, Gerard Ben Arous, Chiara Cammarota, Yann Lecun, Matthieu Wyart, Giulio Biroli
We analyze numerically the training dynamics of deep neural networks (DNNs) using methods developed in the statistical physics of glassy systems.
1 code implementation • 28 Mar 2018 • Taco S. Cohen, Mario Geiger, Maurice Weiler
In algebraic terms, the feature spaces in regular G-CNNs transform according to a regular representation of the group G, whereas the feature spaces in Steerable G-CNNs transform according to the more general induced representations of G. In order to make the network equivariant, each layer in a G-CNN is required to intertwine between the induced representations associated with its input and output space.
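As a minimal sketch of what "transforming according to a regular representation" means (an illustration with the cyclic group $Z_4$, not the paper's steerable construction): the regular representation acts on feature channels by permutation, and $\rho(g)\rho(h) = \rho(gh)$ holds as a matrix identity.

```python
import numpy as np

def regular_rep(g, n=4):
    """Regular representation of the cyclic group Z_n: rho(g) sends basis vector e_i to e_{(i+g) mod n}."""
    rho = np.zeros((n, n))
    for i in range(n):
        rho[(i + g) % n, i] = 1.0
    return rho

n = 4
for g in range(n):
    for h in range(n):
        # Homomorphism property: rho(g) rho(h) == rho((g + h) mod n)
        assert np.allclose(regular_rep(g) @ regular_rep(h), regular_rep((g + h) % n))
print("regular representation of Z_4 verified")
```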
3 code implementations • ICLR 2018 • Taco S. Cohen, Mario Geiger, Jonas Koehler, Max Welling
Convolutional Neural Networks (CNNs) have become the method of choice for learning problems involving 2D planar images.
2 code implementations • 14 Sep 2017 • Taco Cohen, Mario Geiger, Jonas Köhler, Max Welling
Many areas of science and engineering deal with signals with other symmetries, such as rotation-invariant data on the sphere.