Linear Mode Connectivity

10 papers with code • 0 benchmarks • 0 datasets

Linear Mode Connectivity (LMC) is the property that two solutions of a neural network can be joined by a straight line in weight space along which the training or test loss stays low, i.e., linear interpolation between the two weight vectors encounters no loss barrier. LMC is central to understanding the loss landscape of deep networks and underpins work on the lottery ticket hypothesis, permutation symmetries ("re-basin"), and model merging/fusion.
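
Concretely, solutions θ_A and θ_B are linearly mode connected when the loss along θ(α) = (1 − α)·θ_A + α·θ_B stays close to the endpoint losses for all α in [0, 1]. A minimal PyTorch sketch of measuring this barrier (the `evaluate` callback is a hypothetical helper returning a model's loss on held-out data):

```python
# Minimal sketch: loss barrier along the linear path between two trained
# models. `evaluate` is a hypothetical callback returning held-out loss.
import copy

def interpolate_state_dicts(sd_a, sd_b, alpha):
    # theta(alpha) = (1 - alpha) * theta_A + alpha * theta_B
    # (BatchNorm buffers are interpolated too; integer buffers are cast on load)
    return {k: (1 - alpha) * sd_a[k] + alpha * sd_b[k] for k in sd_a}

def loss_barrier(model_a, model_b, evaluate, steps=11):
    sd_a, sd_b = model_a.state_dict(), model_b.state_dict()
    endpoint_loss = (evaluate(model_a) + evaluate(model_b)) / 2
    probe = copy.deepcopy(model_a)
    barrier = 0.0
    for i in range(steps):
        alpha = i / (steps - 1)
        probe.load_state_dict(interpolate_state_dicts(sd_a, sd_b, alpha))
        # Barrier = worst excess loss on the path over the endpoints' average.
        barrier = max(barrier, evaluate(probe) - endpoint_loss)
    return barrier
```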

Most implemented papers

Git Re-Basin: Merging Models modulo Permutation Symmetries

samuela/git-re-basin 11 Sep 2022

The success of deep learning is due in large part to our ability to solve certain massive non-convex optimization problems with relative ease.
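
The core idea is to permute one model's hidden units to align with another's before interpolating or merging. A toy sketch in the spirit of the paper's weight matching for a single hidden layer, using a linear assignment solver (the shapes and scoring here are illustrative assumptions, not the full algorithm):

```python
# Toy sketch of weight matching for one hidden layer: find the permutation
# of model B's hidden units that best agrees with model A, then apply it.
# W1_*: (hidden, in) weights; W2_*: (out, hidden) weights. Illustrative only.
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_hidden_units(W1_a, W1_b, W2_a, W2_b):
    # sim[i, j]: agreement between unit i of A and unit j of B, measured on
    # both incoming rows (W1) and outgoing columns (W2).
    sim = W1_a @ W1_b.T + W2_a.T @ W2_b
    _, perm = linear_sum_assignment(-sim)   # maximize total similarity
    # Permuting rows of W1_b and columns of W2_b together preserves B's function.
    return W1_b[perm], W2_b[:, perm]
```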

Linear Mode Connectivity and the Lottery Ticket Hypothesis

facebookresearch/open_lth ICML 2020

We study whether a neural network optimizes to the same, linearly connected minimum under different samples of SGD noise (e.g., random data order and augmentation).
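
The paper's instability test can be sketched as follows: spawn two copies from a shared checkpoint, train them under different SGD noise, and check the barrier between the results (a rough sketch; `train_fn` is a hypothetical training loop, and `loss_barrier` refers to the sketch above):

```python
# Rough sketch of the instability test: from a shared checkpoint, train two
# copies under different SGD noise (the seed controls data order/augmentation),
# then measure the barrier between them with `loss_barrier` from above.
import copy
import torch

def spawn_and_train(checkpoint, train_fn, seed):
    torch.manual_seed(seed)          # different SGD noise per child
    child = copy.deepcopy(checkpoint)
    train_fn(child)                  # hypothetical training loop
    return child

# child_a = spawn_and_train(model, train_fn, seed=0)
# child_b = spawn_and_train(model, train_fn, seed=1)
# stable if loss_barrier(child_a, child_b, evaluate) stays near zero
```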

Proving Linear Mode Connectivity of Neural Networks via Optimal Transport

damienferbach/ot_lmc 29 Oct 2023

The energy landscape of high-dimensional non-convex optimization problems is crucial to understanding the effectiveness of modern deep neural network architectures.

Linear Mode Connectivity in Multitask and Continual Learning

imirzadeh/MC-SGD ICLR 2021

Continual (sequential) training and multitask (simultaneous) training are often attempting to solve the same overall objective: to find a solution that performs well on all considered tasks.

The Role of Permutation Invariance in Linear Mode Connectivity of Neural Networks

rahimentezari/permutationinvariance ICLR 2022

In this paper, we conjecture that if the permutation invariance of neural networks is taken into account, SGD solutions will likely have no barrier in the linear interpolation between them.
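
The conjecture builds on the fact that hidden units can be reordered without changing a network's function, so every SGD solution stands for an exponentially large family of equivalent weight vectors. A small NumPy check of this invariance:

```python
# Check that permuting hidden units (and compensating in the next layer)
# leaves a two-layer ReLU MLP's function unchanged.
import numpy as np

rng = np.random.default_rng(0)
d, h, o = 4, 8, 3
W1, b1, W2 = rng.normal(size=(h, d)), rng.normal(size=h), rng.normal(size=(o, h))
x = rng.normal(size=d)

def mlp(W1, b1, W2, x):
    return W2 @ np.maximum(W1 @ x + b1, 0.0)

perm = rng.permutation(h)
# Permute rows of W1 and entries of b1; permute columns of W2 to match.
assert np.allclose(mlp(W1, b1, W2, x), mlp(W1[perm], b1[perm], W2[:, perm], x))
```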

Wasserstein Barycenter-based Model Fusion and Linear Mode Connectivity of Neural Networks

sixuli/wbfusionandlmc 13 Oct 2022

In our framework, the fusion occurs in a layer-wise manner and builds on an interpretation of a node in a network as a function of the layer preceding it.
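
The paper fuses networks layer-wise via Wasserstein barycenters of neuron distributions. As a self-contained illustration of the barycenter concept only (not the paper's method): for one-dimensional empirical measures with equal sample counts, the Wasserstein-2 barycenter has a closed form obtained by averaging sorted samples (quantile functions):

```python
# Illustration of the barycenter concept (not the paper's layer-wise fusion):
# for 1D empirical measures with equal sample counts, the Wasserstein-2
# barycenter is the average of the sorted samples (quantile functions).
import numpy as np

rng = np.random.default_rng(0)
a = np.sort(rng.normal(loc=0.0, scale=1.0, size=100))
b = np.sort(rng.normal(loc=3.0, scale=1.0, size=100))
barycenter = (a + b) / 2   # mass i sits midway between a_i and b_i
```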

Re-basin via implicit Sinkhorn differentiation

fagp/sinkhorn-rebasin CVPR 2023

The recent emergence of new algorithms for permuting models into functionally equivalent regions of the solution space has shed some light on the complexity of error surfaces, and some promising properties like mode connectivity.
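
The differentiable building block here is Sinkhorn normalization: alternately normalizing rows and columns of a positive score matrix yields a doubly stochastic matrix, a smooth relaxation of a hard permutation. A minimal sketch of the forward pass only (the paper's implicit differentiation trick is not shown):

```python
# Minimal Sinkhorn normalization in the log domain: alternating row/column
# normalization of a score matrix yields an (approximately) doubly stochastic
# matrix, a differentiable stand-in for a hard permutation.
import torch

def sinkhorn(log_scores, n_iters=20, tau=0.1):
    log_p = log_scores / tau   # smaller tau -> closer to a hard permutation
    for _ in range(n_iters):
        log_p = log_p - torch.logsumexp(log_p, dim=1, keepdim=True)  # rows
        log_p = log_p - torch.logsumexp(log_p, dim=0, keepdim=True)  # columns
    return log_p.exp()

P = sinkhorn(torch.randn(5, 5, requires_grad=True))
print(P.sum(dim=0), P.sum(dim=1))  # both close to all-ones
```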

Lottery Tickets in Evolutionary Optimization: On Sparse Backpropagation-Free Trainability

roberttlange/es-lottery 31 May 2023

Is the lottery ticket phenomenon an idiosyncrasy of gradient-based training or does it generalize to evolutionary optimization?

Layer-wise Linear Mode Connectivity

link-er/layer-wise-lmc 13 Jul 2023

Averaging neural network parameters is an intuitive method for fusing the knowledge of two independent models.
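
A naive sketch of such "vanilla" averaging of checkpoints (without any permutation alignment, which is why it can land in a high-loss region between basins; `model_a` and `model_b` are hypothetical modules with identical architectures):

```python
# Naive "vanilla" averaging of checkpoints with matching architectures.
import copy

def average_state_dicts(state_dicts):
    # Uniform average of matching parameters across checkpoints.
    n = len(state_dicts)
    return {k: sum(sd[k] for sd in state_dicts) / n for k in state_dicts[0]}

# fused = copy.deepcopy(model_a)
# fused.load_state_dict(average_state_dicts([model_a.state_dict(),
#                                            model_b.state_dict()]))
```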

Rethink Model Re-Basin and the Linear Mode Connectivity

xingyuqu/rethink-re-basin 5 Feb 2024

Recent studies suggest that with sufficiently wide models, most SGD solutions can, up to permutation, converge into the same basin.