Search Results for author: Mario Geiger

Found 24 papers, 16 papers with code

Symphony: Symmetry-Equivariant Point-Centered Spherical Harmonics for Molecule Generation

1 code implementation • 27 Nov 2023 • Ameya Daigavane, Song Kim, Mario Geiger, Tess Smidt

We present Symphony, an $E(3)$-equivariant autoregressive generative model for 3D molecular geometries that iteratively builds a molecule from molecular fragments.
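
The name points to the core ingredient: expanding local geometry around a focus point in spherical harmonics. A minimal illustration of such features, computed here with the e3nn package (this is not Symphony's actual code):

```python
import torch
from e3nn import o3  # pip install e3nn

# Illustration only: spherical-harmonic features of candidate positions
# relative to a hypothetical focus atom.
rel_pos = torch.randn(8, 3)  # 8 candidate points around the focus
sh = o3.spherical_harmonics([0, 1, 2], rel_pos, normalize=True)
print(sh.shape)  # torch.Size([8, 9]): (2l+1) components for l = 0, 1, 2
```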

An end-to-end SE(3)-equivariant segmentation network

1 code implementation • 1 Mar 2023 • Ivan Diaz, Mario Geiger, Richard Iain McKinley

Convolutional neural networks (CNNs) allow for parameter sharing and translational equivariance by using convolutional kernels in their linear layers.

Data Augmentation • Segmentation
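
The translational equivariance mentioned above can be verified numerically; a minimal sketch (circular padding makes the property exact under cyclic shifts):

```python
import torch
import torch.nn as nn

# Translation equivariance: shifting the input and then convolving
# equals convolving and then shifting the output.
torch.manual_seed(0)
conv = nn.Conv2d(1, 1, kernel_size=3, padding=1, padding_mode='circular')
x = torch.randn(1, 1, 16, 16)

shift = lambda t: torch.roll(t, shifts=4, dims=-1)
assert torch.allclose(conv(shift(x)), shift(conv(x)), atol=1e-6)
```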

Dissecting the Effects of SGD Noise in Distinct Regimes of Deep Learning

no code implementations • 31 Jan 2023 • Antonio Sclocchi, Mario Geiger, Matthieu Wyart

The results show that SGD noise can be detrimental or instead useful, depending on the training regime.

e3nn: Euclidean Neural Networks

4 code implementations • 18 Jul 2022 • Mario Geiger, Tess Smidt

We present e3nn, a generalized framework for creating E(3) equivariant trainable functions, also known as Euclidean neural networks.
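
A minimal sketch of the kind of building block e3nn exposes (using its public o3 API; the irreps chosen here are arbitrary): an equivariant bilinear layer whose output transforms correctly under any rotation.

```python
import torch
from e3nn import o3  # pip install e3nn

# Scalars (0e) and vectors (1o) in, spherical-harmonic-like features as
# the second input, scalars and vectors out. The choices are arbitrary.
irreps_x = o3.Irreps("1x0e + 1x1o")
irreps_y = o3.Irreps("1x0e + 1x1o + 1x2e")
irreps_out = o3.Irreps("1x0e + 1x1o")
tp = o3.FullyConnectedTensorProduct(irreps_x, irreps_y, irreps_out)

x, y = irreps_x.randn(10, -1), irreps_y.randn(10, -1)

# Equivariance check: rotate-then-apply equals apply-then-rotate.
R = o3.rand_matrix()
lhs = tp(x @ irreps_x.D_from_matrix(R).T, y @ irreps_y.D_from_matrix(R).T)
rhs = tp(x, y) @ irreps_out.D_from_matrix(R).T
assert torch.allclose(lhs, rhs, atol=1e-5)
```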

How memory architecture affects learning in a simple POMDP: the two-hypothesis testing problem

1 code implementation • 16 Jun 2021 • Mario Geiger, Christophe Eloy, Matthieu Wyart

Reinforcement learning is generally difficult for partially observable Markov decision processes (POMDPs), which occur when the agent's observations are partial or noisy.
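
For concreteness, the Bayes-optimal way to accumulate noisy evidence in a two-hypothesis testing problem is a belief update (generic sketch; the paper instead studies agents with constrained memory architectures):

```python
# Generic Bayes update for a two-hypothesis testing problem
# (illustration only, not the paper's memory architectures).
def update_belief(p, lik_h1, lik_h2):
    """p: current probability of hypothesis 1; lik_*: likelihoods of
    the latest observation under each hypothesis."""
    num = p * lik_h1
    return num / (num + (1.0 - p) * lik_h2)

p = 0.5  # uninformative prior
for lik in [(0.7, 0.3), (0.6, 0.4), (0.8, 0.2)]:
    p = update_belief(p, *lik)
print(round(p, 3))  # belief in hypothesis 1 after three noisy observations
```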

SE(3)-equivariant prediction of molecular wavefunctions and electronic densities

no code implementations • NeurIPS 2021 • Oliver T. Unke, Mihail Bogojeski, Michael Gastegger, Mario Geiger, Tess Smidt, Klaus-Robert Müller

Machine learning has enabled the prediction of quantum chemical properties with high accuracy and efficiency, making it possible to bypass computationally costly ab initio calculations.

Transfer Learning

E(3)-Equivariant Graph Neural Networks for Data-Efficient and Accurate Interatomic Potentials

1 code implementation • 8 Jan 2021 • Simon Batzner, Albert Musaelian, Lixin Sun, Mario Geiger, Jonathan P. Mailoa, Mordechai Kornbluth, Nicola Molinari, Tess E. Smidt, Boris Kozinsky

This work presents Neural Equivariant Interatomic Potentials (NequIP), an E(3)-equivariant neural network approach for learning interatomic potentials from ab-initio calculations for molecular dynamics simulations.
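
The general pattern behind such potentials, shown as a hedged sketch (in NequIP the toy energy below would be replaced by an E(3)-equivariant graph network; this is not its actual API): predict a rotation-invariant scalar energy and obtain forces as its negative gradient.

```python
import torch

# Toy energy standing in for a learned interatomic potential; forces
# follow from autograd as the negative gradient w.r.t. positions.
pos = torch.randn(5, 3, requires_grad=True)  # 5 atoms in 3D

def toy_energy(positions):
    d = torch.cdist(positions, positions)            # pairwise distances
    inv = (d + torch.eye(len(positions))).reciprocal()
    return inv.triu(diagonal=1).sum()                # sum over pairs i < j

energy = toy_energy(pos)
forces = -torch.autograd.grad(energy, pos)[0]        # shape (5, 3)
```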

Perspective: A Phase Diagram for Deep Learning unifying Jamming, Feature Learning and Lazy Training

1 code implementation • 30 Dec 2020 • Mario Geiger, Leonardo Petrini, Matthieu Wyart

In this manuscript, we review recent results elucidating (i, ii) and the perspective they offer on the (still unexplained) curse of dimensionality paradox.

Relevance of Rotationally Equivariant Convolutions for Predicting Molecular Properties

1 code implementation • 19 Aug 2020 • Benjamin Kurt Miller, Mario Geiger, Tess E. Smidt, Frank Noé

Equivariant neural networks (ENNs) are graph neural networks embedded in $\mathbb{R}^3$ and are well suited for predicting molecular properties.

Molecular Property Prediction • Property Prediction

Geometric compression of invariant manifolds in neural nets

1 code implementation • 22 Jul 2020 • Jonas Paccolat, Leonardo Petrini, Mario Geiger, Kevin Tyloo, Matthieu Wyart

We confirm these predictions both for a one-hidden-layer FC network trained on the stripe model and for a 16-layer CNN trained on MNIST, for which we also find $\beta_\text{Feature}>\beta_\text{Lazy}$.

Finding Symmetry Breaking Order Parameters with Euclidean Neural Networks

1 code implementation • 4 Jul 2020 • Tess E. Smidt, Mario Geiger, Benjamin Kurt Miller

Curie's principle states that "when effects show certain asymmetry, this asymmetry must be found in the causes that gave rise to them".

Disentangling feature and lazy training in deep neural networks

no code implementations • 19 Jun 2019 • Mario Geiger, Stefano Spigler, Arthur Jacot, Matthieu Wyart

Two distinct limits for deep learning have been derived as the network width $h\rightarrow \infty$, depending on how the weights of the last layer scale with $h$.
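
Concretely, a standard way to state the dichotomy for one hidden layer (the paper's exact parametrization may differ): with output

$$f(x) = \frac{1}{h^{a}} \sum_{i=1}^{h} \beta_i\, \sigma(w_i \cdot x),$$

the scaling $a = \tfrac{1}{2}$ yields the lazy (neural-tangent-kernel) limit as $h \to \infty$, while $a = 1$ yields the feature-learning (mean-field) limit.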

Asymptotic learning curves of kernel methods: empirical data v.s. Teacher-Student paradigm

no code implementations • 26 May 2019 • Stefano Spigler, Mario Geiger, Matthieu Wyart

We extract $a$ from real data by performing kernel PCA, leading to $\beta\approx0.36$ for MNIST and $\beta\approx0.07$ for CIFAR10, in good agreement with observations.

regression
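
For context, $\beta$ in this line of work is the exponent of the power-law decay of the generalization error $\bar{\epsilon}$ with the number of training points $n$,

$$\bar{\epsilon}(n) \sim n^{-\beta},$$

so the small exponent for CIFAR10 corresponds to a much slower learning curve than for MNIST.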

A General Theory of Equivariant CNNs on Homogeneous Spaces

no code implementations • NeurIPS 2019 • Taco Cohen, Mario Geiger, Maurice Weiler

Feature maps in these networks represent fields on a homogeneous base space, and layers are equivariant maps between spaces of fields.

General Classification

A jamming transition from under- to over-parametrization affects loss landscape and generalization

no code implementations • 22 Oct 2018 • Stefano Spigler, Mario Geiger, Stéphane d'Ascoli, Levent Sagun, Giulio Biroli, Matthieu Wyart

We argue that in fully-connected networks a phase transition delimits the over- and under-parametrized regimes where fitting can or cannot be achieved.

Comparing Dynamics: Deep Neural Networks versus Glassy Systems

no code implementations • ICML 2018 • Marco Baity-Jesi, Levent Sagun, Mario Geiger, Stefano Spigler, Gerard Ben Arous, Chiara Cammarota, Yann Lecun, Matthieu Wyart, Giulio Biroli

We analyze numerically the training dynamics of deep neural networks (DNNs) using methods developed in the statistical physics of glassy systems.

Intertwiners between Induced Representations (with Applications to the Theory of Equivariant Neural Networks)

1 code implementation • 28 Mar 2018 • Taco S. Cohen, Mario Geiger, Maurice Weiler

In algebraic terms, the feature spaces in regular G-CNNs transform according to a regular representation of the group G, whereas the feature spaces in Steerable G-CNNs transform according to the more general induced representations of G. In order to make the network equivariant, each layer in a G-CNN is required to intertwine between the induced representations associated with its input and output space.
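
In symbols, this is the standard intertwiner condition: a layer $\Phi$ mapping between feature spaces that carry representations $\pi_1$ and $\pi_2$ of $G$ must satisfy

$$\Phi \circ \pi_1(g) = \pi_2(g) \circ \Phi \quad \text{for all } g \in G.$$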

Spherical CNNs

3 code implementations • ICLR 2018 • Taco S. Cohen, Mario Geiger, Jonas Koehler, Max Welling

Convolutional Neural Networks (CNNs) have become the method of choice for learning problems involving 2D planar images.

Computational Efficiency • regression
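
The paper's building block is a rotation-equivariant spherical correlation: correlating a filter $\psi$ with a signal $f$ on the sphere produces a function on the rotation group,

$$[\psi \star f](R) = \int_{S^2} f(x)\, \psi(R^{-1} x)\, dx, \qquad R \in SO(3),$$

which the authors evaluate efficiently with a generalized (non-commutative) FFT.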

Convolutional Networks for Spherical Signals

2 code implementations • 14 Sep 2017 • Taco Cohen, Mario Geiger, Jonas Köhler, Max Welling

Many areas of science and engineering deal with signals with other symmetries, such as rotation-invariant data on the sphere.

General Classification • Translation
