Search Results for author: Alessandro Favero

Found 7 papers, 5 papers with code

Multi-Modal Hallucination Control by Visual Information Grounding

no code implementations • 20 Mar 2024 • Alessandro Favero, Luca Zancato, Matthew Trager, Siddharth Choudhary, Pramuditha Perera, Alessandro Achille, Ashwin Swaminathan, Stefano Soatto

In particular, we show that as more tokens are generated, the reliance on the visual prompt decreases, and this behavior strongly correlates with the emergence of hallucinations.

Hallucination • Visual Question Answering (VQA)
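
A simple way to quantify this declining reliance, assuming access to the decoder's next-token logits computed both with and without the image in context (an illustrative probe, not the paper's exact method):

    import torch

    def visual_reliance(logits_with_image, logits_without_image):
        # KL divergence between the next-token distributions conditioned
        # on (prompt + image) vs. (prompt alone): a proxy for how much
        # the model currently relies on the visual prompt. Values near
        # zero mean the image barely shapes the token, the regime the
        # paper links to hallucinations.
        logp = torch.log_softmax(logits_with_image, dim=-1)
        logq = torch.log_softmax(logits_without_image, dim=-1)
        return torch.sum(logp.exp() * (logp - logq), dim=-1)

    # Hypothetical usage: compute the score at every generation step
    # and watch it decay as the answer grows.
    # scores = [visual_reliance(logits_img[t], logits_txt[t]) for t in range(T)]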

A Phase Transition in Diffusion Models Reveals the Hierarchical Nature of Data

1 code implementation • 26 Feb 2024 • Antonio Sclocchi, Alessandro Favero, Matthieu Wyart

We find that the backward diffusion process acting after a time $t$ is governed by a phase transition at some threshold time, where the probability of reconstructing high-level features, like the class of an image, suddenly drops.
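
A toy one-dimensional version of this forward-backward experiment, with two well-separated Gaussian modes standing in for two classes (only a sketch: in this simple setting class retention decays smoothly with $t$, whereas the paper shows a sharp transition for hierarchical data):

    import numpy as np

    rng = np.random.default_rng(0)

    # Two "classes" = two well-separated Gaussian modes.
    mu = np.array([-3.0, 3.0])
    labels = rng.integers(0, 2, size=5000)
    x0 = mu[labels] + 0.3 * rng.standard_normal(5000)

    for t in [0.1, 0.5, 1.0, 2.0, 4.0]:
        a = np.exp(-t)  # variance-preserving forward process up to time t
        xt = a * x0 + np.sqrt(1.0 - a**2) * rng.standard_normal(x0.size)
        # Idealized backward process: resample the mode from the posterior
        # p(mode | x_t) (treating within-mode spread as negligible), then
        # check whether the class survived the round trip.
        logp = -0.5 * (xt[:, None] - a * mu[None, :])**2 / (1.0 - a**2)
        p = np.exp(logp - logp.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        recon = (rng.random(x0.size) < p[:, 1]).astype(int)
        print(f"t={t:3.1f}  class retained: {np.mean(recon == labels):.2f}")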

How Deep Neural Networks Learn Compositional Data: The Random Hierarchy Model

1 code implementation • 5 Jul 2023 • Francesco Cagnetta, Leonardo Petrini, Umberto M. Tomasini, Alessandro Favero, Matthieu Wyart

The model is a classification task where each class corresponds to a group of high-level features, chosen among several equivalent groups associated with the same class.
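
A minimal generative sketch of such a hierarchy (the parameter values and the shared vocabulary across levels are simplifying assumptions, not the paper's exact instantiation of the Random Hierarchy Model):

    import random

    random.seed(0)

    v, m, s, L = 8, 3, 2, 3  # vocabulary size, rules per symbol, branching, depth

    # Fixed random grammar: every symbol can be rewritten by one of m
    # equivalent production rules, each emitting s lower-level symbols.
    rules = {a: [tuple(random.randrange(v) for _ in range(s))
                 for _ in range(m)]
             for a in range(v)}

    def sample(cls):
        # Expand the class label down L levels; each expansion picks one
        # of the m equivalent rules, so many inputs share the same class.
        layer = [cls]
        for _ in range(L):
            layer = [sym for a in layer for sym in random.choice(rules[a])]
        return layer

    x = sample(cls=2)
    print(len(x), x)  # s**L = 8 low-level features encoding class 2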

Task Arithmetic in the Tangent Space: Improved Editing of Pre-Trained Models

1 code implementation • NeurIPS 2023 • Guillermo Ortiz-Jimenez, Alessandro Favero, Pascal Frossard

Task arithmetic has recently emerged as a cost-effective and scalable approach to edit pre-trained models directly in weight space: By adding the fine-tuned weights of different tasks, the model's performance can be improved on these tasks, while negating them leads to task forgetting.

Disentanglement
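
The arithmetic itself is simple to state: a task vector is the difference $\tau_i = \theta_i - \theta_0$ between fine-tuned and pre-trained weights, and editing means $\theta = \theta_0 + \sum_i \lambda_i \tau_i$. A minimal state-dict sketch of that operation (the paper's contribution, performing the arithmetic on a linearized model in the tangent space, is not shown here):

    import torch

    def task_vector(pretrained, finetuned):
        # tau = theta_finetuned - theta_pretrained, per parameter tensor.
        return {k: finetuned[k] - pretrained[k] for k in pretrained}

    def edit(pretrained, task_vectors, coeffs):
        # theta = theta_0 + sum_i lambda_i * tau_i. A positive lambda adds
        # a task's skill; a negative lambda induces forgetting of it.
        edited = {k: v.clone() for k, v in pretrained.items()}
        for tau, lam in zip(task_vectors, coeffs):
            for k in edited:
                edited[k] += lam * tau[k]
        return edited

    # e.g. edit(theta0, [tau_a, tau_b], [1.0, -1.0]) would add task a
    # while forgetting task b (coefficient values are illustrative).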

What Can Be Learnt With Wide Convolutional Neural Networks?

1 code implementation • 1 Aug 2022 • Francesco Cagnetta, Alessandro Favero, Matthieu Wyart

Interestingly, we find that, despite their hierarchical structure, the functions generated by infinitely-wide deep CNNs are too rich to be efficiently learnable in high dimension.
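
For intuition about the function class in question: the kernel of an infinitely wide one-hidden-layer ReLU CNN has a closed form, an arc-cosine kernel applied patch by patch and averaged. A sketch of that shallow case (the paper analyses deep, hierarchical versions of such kernels; the normalizations here are illustrative assumptions):

    import numpy as np

    def arccos1(u):
        # Order-1 arc-cosine kernel for ReLU units, as a function of the
        # cosine similarity u between two patches.
        u = np.clip(u, -1.0, 1.0)
        return (np.sqrt(1.0 - u**2) + (np.pi - np.arccos(u)) * u) / np.pi

    def shallow_cnn_kernel(x, y, patch=3):
        # Infinite-width one-hidden-layer conv net: average the patchwise
        # arc-cosine kernel over all aligned patches of the two inputs.
        vals = []
        for i in range(len(x) - patch + 1):
            xp, yp = x[i:i + patch], y[i:i + patch]
            nx, ny = np.linalg.norm(xp), np.linalg.norm(yp)
            vals.append(nx * ny * arccos1(xp @ yp / (nx * ny)))
        return np.mean(vals)

    rng = np.random.default_rng(0)
    x, y = rng.standard_normal(16), rng.standard_normal(16)
    print(shallow_cnn_kernel(x, y))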

Locality defeats the curse of dimensionality in convolutional teacher-student scenarios

no code implementations • NeurIPS 2021 • Alessandro Favero, Francesco Cagnetta, Matthieu Wyart

Convolutional neural networks perform a local and translationally invariant treatment of the data: quantifying which of these two aspects is central to their success remains a challenge.

Regression
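
A small teacher-student illustration of the locality effect, comparing kernel ridge regression with a global kernel against a student kernel that only couples s-dimensional patches (the teacher, parameters, and kernels below are illustrative assumptions, not the paper's setup; one would expect the local student to generalize better here, mirroring the paper's point that locality mitigates the curse of dimensionality):

    import numpy as np
    from sklearn.kernel_ridge import KernelRidge

    rng = np.random.default_rng(0)
    d, s, n = 16, 2, 512  # input dimension, patch size (locality), train size

    # Teacher: a sum of random functions of s-dimensional patches,
    # i.e. a local but high-dimensional target.
    w = rng.standard_normal((d - s + 1, s))
    def teacher(X):
        return sum(np.cos(X[:, i:i + s] @ w[i]) for i in range(d - s + 1))

    Xtr, Xte = rng.standard_normal((n, d)), rng.standard_normal((500, d))
    ytr, yte = teacher(Xtr), teacher(Xte)

    def local_rbf(x, y):
        # Student kernel that only compares s-dimensional patches.
        return sum(np.exp(-np.sum((x[i:i + s] - y[i:i + s])**2))
                   for i in range(d - s + 1))

    for name, kern in [("global rbf", "rbf"), ("local rbf", local_rbf)]:
        model = KernelRidge(kernel=kern, alpha=1e-3).fit(Xtr, ytr)
        mse = np.mean((model.predict(Xte) - yte)**2) / np.var(yte)
        print(f"{name}: relative test error {mse:.3f}")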
