no code implementations • 18 Mar 2024 • Pere Vergés, Igor Nunes, Mike Heddes, Tony Givargis, Alexandru Nicolau
Our work introduces an innovative approach to graph learning by leveraging Hyperdimensional Computing.
1 code implementation • 12 Jan 2024 • Mike Heddes, Narayan Srinivasa, Tony Givargis, Alexandru Nicolau
Sparsification of ANNs is often motivated by time, memory, and energy savings during model inference only, yielding no benefits during training.
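To make the contrast concrete, here is a toy illustration of training under a fixed sparsity mask, where masked weights stay zero throughout training rather than being pruned only afterwards. This is a generic sketch with hypothetical names, not the paper's algorithm:

```python
import random

# Toy dense "layer": a weight vector trained with SGD on y = 2 * x[0].
random.seed(0)
n = 8
w = [random.uniform(-1, 1) for _ in range(n)]

# Fix a binary mask up front and re-apply it after every update, so
# masked weights remain zero for the whole of training. A real sparse
# implementation would skip their compute and storage entirely.
mask = [1 if i < n // 2 else 0 for i in range(n)]

def step(w, x, y, lr=0.1):
    pred = sum(wi * xi for wi, xi in zip(w, x))
    err = pred - y
    w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return [wi * m for wi, m in zip(w, mask)]  # enforce sparsity

for _ in range(200):
    x = [random.uniform(-1, 1) for _ in range(n)]
    w = step(w, x, 2 * x[0])
```

After training, the masked half of `w` is exactly zero while the active weights have still fit the target.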
1 code implementation • 27 May 2023 • Igor Nunes, Mike Heddes, Pere Vergés, Danny Abraham, Alexander Veidenbaum, Alexandru Nicolau, Tony Givargis
DotHash can be used to estimate the Jaccard index and, to the best of our knowledge, is the first method that can also estimate the Adamic-Adar index and a family of related metrics.
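The dot-product idea behind such estimators can be sketched as follows: give every element a fixed random ±1/√d vector, represent a set by the sum of its elements' vectors, and estimate the intersection size as the dot product of two set signatures, from which the Jaccard index follows. This is a rough sketch under those assumptions, not the paper's exact construction:

```python
import random

d = 2048                 # signature dimension; larger d lowers variance
rng = random.Random(42)
_vecs = {}

def elem_vec(e):
    # Fixed random +-1/sqrt(d) vector per element (memoized).
    if e not in _vecs:
        _vecs[e] = [rng.choice((-1.0, 1.0)) / d ** 0.5 for _ in range(d)]
    return _vecs[e]

def signature(s):
    # Set signature: elementwise sum of its elements' vectors.
    sig = [0.0] * d
    for e in s:
        for i, v in enumerate(elem_vec(e)):
            sig[i] += v
    return sig

A = set(range(0, 60))
B = set(range(30, 90))
sa, sb = signature(A), signature(B)

# Dot product estimates |A ∩ B| (true value: 30); cross terms between
# distinct elements' random vectors cancel in expectation.
inter = sum(a * b for a, b in zip(sa, sb))
jaccard = inter / (len(A) + len(B) - inter)   # true value: 30/90 ≈ 0.33
```

The same signatures can be reweighted per element, which is how a dot-product estimator extends beyond Jaccard to degree-weighted metrics of the Adamic-Adar kind.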
1 code implementation • 24 Apr 2023 • Pere Vergés, Mike Heddes, Igor Nunes, Tony Givargis, Alexandru Nicolau
The experiments were run on four different machines with multiple hyperparameter configurations, and the results were compared to a popular prototyping library built on PyTorch.
1 code implementation • 18 May 2022 • Mike Heddes, Igor Nunes, Pere Vergés, Denis Kleyko, Danny Abraham, Tony Givargis, Alexandru Nicolau, Alexander Veidenbaum
Hyperdimensional computing (HD), also known as vector symbolic architectures (VSA), is a framework for computing with distributed representations by exploiting properties of random high-dimensional vector spaces.
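The core operations of such a framework can be sketched in plain Python with bipolar hypervectors: binding by elementwise multiplication, bundling by majority vote, and similarity by a normalized dot product. This is a generic multiply-add-permute-style sketch, not the API of any particular library:

```python
import random

D = 10000                      # hypervector dimensionality
rng = random.Random(0)

def hv():
    # Random bipolar hypervector; random pairs are quasi-orthogonal.
    return [rng.choice((-1, 1)) for _ in range(D)]

def bind(a, b):
    # Elementwise multiply; self-inverse for bipolar vectors.
    return [x * y for x, y in zip(a, b)]

def bundle(*vs):
    # Majority vote per component (ties broken toward +1).
    return [1 if sum(t) >= 0 else -1 for t in zip(*vs)]

def sim(a, b):
    # Normalized dot product in [-1, 1]; ~0 for unrelated vectors.
    return sum(x * y for x, y in zip(a, b)) / D

# Encode a record of key-value pairs as a bundle of bindings,
# then unbind with a key to query the stored value.
country, capital = hv(), hv()
mexico, cdmx = hv(), hv()
record = bundle(bind(country, mexico), bind(capital, cdmx))

query = bind(record, capital)  # approximately cdmx, plus noise
print(sim(query, cdmx))        # clearly positive
print(sim(query, mexico))      # near zero
```

Unbinding recovers the stored value only approximately, but in high dimensions the similarity gap between the correct answer and unrelated vectors is large enough to decode reliably.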
no code implementations • 16 May 2022 • Igor Nunes, Mike Heddes, Tony Givargis, Alexandru Nicolau
Hyperdimensional Computing (HDC) is a computation framework based on properties of high-dimensional random spaces.
1 code implementation • 16 May 2022 • Igor Nunes, Mike Heddes, Tony Givargis, Alexandru Nicolau, Alex Veidenbaum
HDC exploits characteristics of biological neural systems such as high-dimensionality, randomness and a holographic representation of information to achieve a good balance between accuracy, efficiency and robustness.
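The robustness claim has a simple quantitative face: because information is spread holographically across all components, corrupting a fraction f of a bipolar hypervector's components lowers its similarity to the original only linearly, to 1 − 2f, rather than destroying it. A minimal demonstration, assuming bipolar vectors and sign-flip noise:

```python
import random

D = 10000
rng = random.Random(7)
v = [rng.choice((-1, 1)) for _ in range(D)]

def corrupt(vec, frac):
    # Flip the sign of a random `frac` of the components.
    out = list(vec)
    for i in rng.sample(range(D), int(frac * D)):
        out[i] = -out[i]
    return out

def sim(a, b):
    # Normalized dot product in [-1, 1].
    return sum(x * y for x, y in zip(a, b)) / D

for frac in (0.1, 0.3):
    print(frac, sim(v, corrupt(v, frac)))  # prints 0.8, then 0.4
```

Each flipped component changes its contribution from +1 to −1, so the similarity after flipping a fraction f is exactly 1 − 2f, independent of which components were hit.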