no code implementations • 2 Oct 2023 • Andreas Roth, Thomas Liebig
Our work explores this phenomenon in graph neural networks by investigating how models that differ only in their initializations differ in the features they use for predictions.
1 code implementation • 31 Aug 2023 • Andreas Roth, Thomas Liebig
Our study reveals new theoretical insights into over-smoothing and feature over-correlation in deep graph neural networks.
1 code implementation • 31 Aug 2023 • Cedric Sanders, Andreas Roth, Thomas Liebig
CurvPool exploits the notion of curvature of a graph to adaptively identify structures responsible for both over-smoothing and over-squashing.
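A minimal sketch of the curvature idea behind this kind of pooling, assuming the simple augmented Forman curvature for unweighted graphs (the function name `forman_curvature` and the toy edge list are illustrative, not from the paper): strongly negative edge curvature flags bottleneck-like structures (an over-squashing risk), while strongly positive curvature flags densely clustered regions (an over-smoothing risk).

```python
from collections import defaultdict

def forman_curvature(edges):
    """Augmented Forman curvature for an unweighted, undirected graph.

    F(u, v) = 4 - deg(u) - deg(v) + 3 * #triangles(u, v)

    Negative values indicate bottleneck-like edges; large positive
    values indicate densely clustered regions.
    """
    # Build an adjacency map from the edge list.
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    curvature = {}
    for u, v in edges:
        # Triangles through (u, v) are common neighbors of u and v.
        triangles = len(adj[u] & adj[v])
        curvature[(u, v)] = 4 - len(adj[u]) - len(adj[v]) + 3 * triangles
    return curvature

# Toy example: on a path graph, the middle edge has the lowest curvature.
print(forman_curvature([(0, 1), (1, 2), (2, 3)]))
# → {(0, 1): 1, (1, 2): 0, (2, 3): 1}
```

A pooling scheme can then group nodes along high-curvature edges (collapsing clustered regions) while preserving low-curvature bottleneck edges.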
no code implementations • 21 Nov 2022 • Andreas Roth, Thomas Liebig
Our framework can be combined with any spatio-temporal Graph Neural Network that exploits spatio-temporal correlations with surrounding observed locations via the network's graph structure.
1 code implementation • 1 Jul 2022 • Andreas Roth, Thomas Liebig
Popular graph neural networks are shallow models, despite the success of very deep architectures in other application domains of deep learning.
no code implementations • 25 Aug 2020 • Lukas Faber, Sandro Luck, Damian Pascual, Andreas Roth, Gino Brunner, Roger Wattenhofer
The automatic generation of medleys, i.e., musical pieces formed by concatenating different songs via smooth transitions, is not well studied in the current literature.