1 code implementation • 25 Jan 2023 • Gennaro Gala, Daniele Grattarola, Erik Quaeghebeur
Cellular automata (CAs) are computational models exhibiting rich dynamics emerging from the local interaction of cells arranged in a regular lattice.
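To make the "rich dynamics from local interaction" concrete, here is a minimal classic CA (Conway's Game of Life) on a 2D lattice with numpy; this is an illustrative example of the CA family, not the model studied in the paper.

```python
import numpy as np

def life_step(grid):
    """One step of Conway's Game of Life, a classic cellular automaton.
    Each cell's next state depends only on its 8 neighbours (toroidal wrap)."""
    # Count live neighbours by summing the 8 shifted copies of the grid.
    neighbours = sum(
        np.roll(np.roll(grid, i, axis=0), j, axis=1)
        for i in (-1, 0, 1) for j in (-1, 0, 1)
        if (i, j) != (0, 0)
    )
    # A live cell survives with 2 or 3 neighbours; a dead cell is born with 3.
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(np.uint8)

# A "blinker" oscillates with period 2: two steps return the initial grid.
grid = np.zeros((5, 5), dtype=np.uint8)
grid[2, 1:4] = 1
after_two = life_step(life_step(grid))
```

Despite the rule being purely local, patterns like gliders and oscillators emerge globally, which is the behaviour the paper's learned CAs aim to capture.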
2 code implementations • 31 May 2022 • Daniele Grattarola, Pierre Vandergheynst
We consider the problem of learning implicit neural representations (INRs) for signals on non-Euclidean domains.
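At its core, an implicit neural representation is a network mapping a point's coordinates to the signal value at that point. The sketch below shows this with a toy numpy MLP; for non-Euclidean domains, the input coordinates can be an intrinsic embedding of the points (e.g. spectral coordinates), which is an assumption of this sketch and not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def inr_forward(coords, weights):
    """Forward pass of a toy implicit neural representation: an MLP mapping
    each point's coordinates to the signal value at that point."""
    h = coords
    for W, b in weights[:-1]:
        h = np.tanh(h @ W + b)   # hidden layers with tanh nonlinearity
    W, b = weights[-1]
    return h @ W + b             # linear output: the reconstructed signal

# Toy setup: 10 points with 4-dim (hypothetically spectral) coordinates,
# reconstructing a scalar signal. Weights are random, i.e. untrained.
dims = [4, 16, 16, 1]
weights = [(rng.normal(size=(a, b)) * 0.1, np.zeros(b))
           for a, b in zip(dims, dims[1:])]
coords = rng.normal(size=(10, 4))
signal = inr_forward(coords, weights)
```

Training would fit the weights so that `inr_forward` reproduces the observed signal at the sampled points, giving a continuous, resolution-free representation.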
1 code implementation • 21 Mar 2022 • Zhiqiang Zhong, Guadalupe Gonzalez, Daniele Grattarola, Jun Pang
Here, we formulate the unsupervised network embedding (NE) task as an r-ego network discrimination problem and develop the SELENE framework for learning on networks with homophily and heterophily.
1 code implementation • NeurIPS 2021 • Daniele Grattarola, Lorenzo Livi, Cesare Alippi
Cellular automata (CA) are a class of computational models that exhibit rich dynamics emerging from the local interaction of cells arranged in a regular lattice.
2 code implementations • 11 Oct 2021 • Daniele Grattarola, Daniele Zambon, Filippo Maria Bianchi, Cesare Alippi
Inspired by the conventional pooling layers in convolutional neural networks, many recent works in the field of graph machine learning have introduced pooling operators to reduce the size of graphs.
1 code implementation • ICLR 2021 • Benjamin Paassen, Daniele Grattarola, Daniele Zambon, Cesare Alippi, Barbara Eva Hammer
With this result, we hope to provide a firm theoretical basis for a next generation of time series prediction models.
1 code implementation • 22 Jun 2020 • Daniele Grattarola, Cesare Alippi
In this paper we present Spektral, an open-source Python library for building graph neural networks with TensorFlow and the Keras application programming interface.
1 code implementation • 24 Oct 2019 • Filippo Maria Bianchi, Daniele Grattarola, Lorenzo Livi, Cesare Alippi
In graph neural networks (GNNs), pooling operators compute local summaries of input graphs to capture their global properties, and they are fundamental for building deep GNNs that learn hierarchical representations.
Ranked #1 on Graph Classification on Bench-hard
no code implementations • 25 Sep 2019 • Filippo Maria Bianchi, Daniele Grattarola, Cesare Alippi
For each node, our method learns a soft cluster assignment vector that depends on the node features, the target inference task (e.g., a graph classification loss), and, thanks to the minCut objective, also on the connectivity structure of the graph.
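Once a soft assignment matrix is available, pooling itself is two matrix products. The sketch below shows this mechanical step with numpy; in MinCutPool the assignment matrix S is produced by an MLP on the node features and trained with the minCut regulariser, which is not shown here.

```python
import numpy as np

def soft_pool(X, A, S):
    """Assignment-based graph pooling: given node features X (n x d),
    adjacency A (n x n), and a soft cluster assignment S (n x k),
    aggregate features per cluster and coarsen the adjacency as S^T A S."""
    X_pool = S.T @ X       # (k x d) pooled cluster features
    A_pool = S.T @ A @ S   # (k x k) coarsened adjacency between clusters
    return X_pool, A_pool

# Toy path graph on 4 nodes, pooled into 2 clusters.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.eye(4)
# Assignment: nodes {0, 1} -> cluster 0, nodes {2, 3} -> cluster 1.
S = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)
X_pool, A_pool = soft_pool(X, A, S)
```

The diagonal of `A_pool` counts intra-cluster edges and the off-diagonal counts edges cut between clusters, which is exactly the quantity the minCut objective penalises.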
4 code implementations • ICML 2020 • Filippo Maria Bianchi, Daniele Grattarola, Cesare Alippi
Spectral clustering (SC) is a popular clustering technique to find strongly connected communities on a graph.
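The classical SC pipeline embeds nodes with eigenvectors of the graph Laplacian and clusters that embedding. A minimal numpy sketch for two communities, using the sign of the Fiedler vector in place of the usual k-means step:

```python
import numpy as np

def spectral_bipartition(A):
    """Split a graph into two communities via the Fiedler vector: the
    eigenvector of the combinatorial Laplacian L = D - A associated with
    the second-smallest eigenvalue. Its sign pattern yields the partition."""
    D = np.diag(A.sum(axis=1))
    L = D - A
    eigvals, eigvecs = np.linalg.eigh(L)  # eigh returns ascending eigenvalues
    fiedler = eigvecs[:, 1]
    return (fiedler > 0).astype(int)

# Two triangles joined by a single bridge edge: a clear two-community graph.
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
labels = spectral_bipartition(A)
```

The paper's contribution is making this pipeline differentiable so it can be trained end-to-end inside a GNN, rather than run as a standalone preprocessing step.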
2 code implementations • 18 Mar 2019 • Daniele Zambon, Daniele Grattarola, Lorenzo Livi, Cesare Alippi
This paper proposes an autoregressive (AR) model for sequences of graphs, which generalises traditional AR models.
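For reference, the classical scalar case being generalised: an AR(p) model predicts each value as a weighted sum of the previous p values, fit here by least squares. The paper lifts this idea to sequences of graphs, replacing scalars with graphs and the weighted sum with an operation in graph space (only the classical case is sketched).

```python
import numpy as np

def fit_ar(x, p):
    """Ordinary least-squares fit of a scalar AR(p) model
    x_t = a_1 * x_{t-1} + ... + a_p * x_{t-p} + noise."""
    T = len(x)
    # Design matrix: row for time t holds the lags (x_{t-1}, ..., x_{t-p}).
    X = np.column_stack([x[p - i - 1: T - i - 1] for i in range(p)])
    y = x[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# A noiseless AR(1) process x_t = 0.8 * x_{t-1} is recovered exactly.
x = 0.8 ** np.arange(50)
a = fit_ar(x, 1)
```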
1 code implementation • 5 Jan 2019 • Filippo Maria Bianchi, Daniele Grattarola, Lorenzo Livi, Cesare Alippi
Popular graph neural networks implement convolution operations on graphs based on polynomial spectral filters.
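A polynomial spectral filter applies a polynomial of the graph Laplacian to the node features, so each filter order mixes information from one hop further away. A numpy sketch in the plain monomial basis (ChebNet-style layers use Chebyshev polynomials instead, and the paper's ARMA filters use a rational parametrisation):

```python
import numpy as np

def poly_filter(X, L, coeffs):
    """Apply the polynomial spectral filter sum_k c_k L^k X to node
    features X, given the graph Laplacian L and coefficients c_0..c_K."""
    out = np.zeros_like(X)
    Z = X.copy()        # L^0 X = X
    for c in coeffs:
        out += c * Z
        Z = L @ Z       # advance to the next power of the Laplacian
    return out

# Path graph on 3 nodes.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
X = np.array([[1.0], [0.0], [0.0]])
# Coefficients (1, 0) give the identity filter, leaving X unchanged;
# any nonzero first-order term would propagate features to neighbours.
y = poly_filter(X, L, [1.0, 0.0])
```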
Ranked #4 on Skeleton Based Action Recognition on SBU
1 code implementation • 11 Dec 2018 • Daniele Grattarola, Lorenzo Livi, Cesare Alippi
Constant-curvature Riemannian manifolds (CCMs) have been shown to be ideal embedding spaces in many application domains, as their non-Euclidean geometry can naturally account for some relevant properties of data, like hierarchy and circularity.
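The two non-flat CCMs are the sphere (positive curvature, suited to circular structure) and hyperbolic space (negative curvature, suited to hierarchies). Their geodesic distances, which replace the Euclidean distance when embedding into these spaces, are short closed forms (unit-curvature case only, as a sketch):

```python
import numpy as np

def sphere_dist(x, y):
    """Geodesic distance between unit vectors on the sphere (curvature +1)."""
    return np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))

def hyperboloid_dist(x, y):
    """Geodesic distance on the hyperboloid model of hyperbolic space
    (curvature -1), via the Minkowski inner product
    <x, y> = -x_0 y_0 + sum_i x_i y_i."""
    mink = -x[0] * y[0] + np.dot(x[1:], y[1:])
    return np.arccosh(np.clip(-mink, 1.0, None))

# Antipodal points on the unit circle are pi apart.
d_sphere = sphere_dist(np.array([1.0, 0.0]), np.array([-1.0, 0.0]))

# Two points on the hyperboloid x0^2 - x1^2 = 1, one geodesic unit apart.
p = np.array([np.cosh(1.0), np.sinh(1.0)])
q = np.array([1.0, 0.0])
d_hyp = hyperboloid_dist(p, q)
```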
1 code implementation • 16 May 2018 • Daniele Grattarola, Daniele Zambon, Cesare Alippi, Lorenzo Livi
A common approach is to use embedding techniques to represent graphs as points in a conventional Euclidean space, but non-Euclidean spaces have often been shown to be better suited for embedding graphs.