Low-rank compression

8 papers with code • 0 benchmarks • 0 datasets

Low-rank compression reduces the size and inference cost of a neural network by approximating each layer's weight matrix (or tensor) with a low-rank factorization, so the layer can be stored and applied as a product of much smaller factors, usually followed by fine-tuning to recover accuracy.
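As a minimal sketch of the common starting point (not tied to any specific paper below), a weight matrix can be replaced by its truncated SVD, which gives the best rank-r approximation in the Frobenius norm; the shapes and rank here are illustrative assumptions.

```python
import numpy as np

def low_rank_factors(W, rank):
    """Best rank-`rank` approximation of W as two factors A (m x r) and B (r x n)."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * s[:rank]      # absorb singular values into the left factor
    B = Vt[:rank, :]
    return A, B

W = np.random.randn(512, 1024)      # illustrative layer weight
A, B = low_rank_factors(W, rank=64)
# storage drops from 512*1024 to 64*(512+1024) parameters
err = np.linalg.norm(W - A @ B) / np.linalg.norm(W)
print(f"relative reconstruction error: {err:.3f}")
```

The papers below mostly differ in how the per-layer ranks are chosen and in how the factorization interacts with training.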

Most implemented papers

Domain-adaptive deep network compression

mmasana/DALR ICCV 2017

We show that domain transfer leads to large shifts in network activations and that it is desirable to take this into account when compressing.
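A hedged illustration of the idea in this snippet (not the paper's actual DALR procedure): instead of truncating the weights directly, one can pick the low-rank subspace that best preserves the layer's responses on activations collected from the target domain. `W`, `X`, and the rank below are illustrative assumptions.

```python
import numpy as np

def data_aware_factors(W, X, rank):
    """Factor W (m x n) as P @ Q so that the responses X @ W.T are preserved
    on the given target-domain activations X (N x n)."""
    Y = X @ W.T                           # layer responses on target-domain data, N x m
    _, _, Vt = np.linalg.svd(Y, full_matrices=False)
    P = Vt[:rank, :].T                    # m x r basis of the dominant response subspace
    Q = P.T @ W                           # r x n projected weights
    return P, Q

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 512))
X = rng.standard_normal((1000, 512))      # stand-in for activations from the target domain
P, Q = data_aware_factors(W, X, rank=32)
resp_err = np.linalg.norm(X @ W.T - X @ (P @ Q).T) / np.linalg.norm(X @ W.T)
print(f"relative response error: {resp_err:.3f}")
```

With real activations, which are far from isotropic, such a response-aware choice can retain accuracy at lower ranks than truncating W alone.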

Compressing Neural Networks: Towards Determining the Optimal Layer-wise Decomposition

lucaslie/torchprune NeurIPS 2021

We present a novel global compression framework for deep neural networks that automatically analyzes each layer to identify the optimal per-layer compression ratio, while simultaneously achieving the desired overall compression.
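The snippet describes choosing a compression ratio per layer under a global target. A hedged toy sketch of that kind of global allocation (not the paper's actual criterion or its theory) is to spend a parameter budget on the globally largest per-layer-normalized singular values:

```python
import numpy as np

def allocate_ranks(weights, keep_ratio):
    """Toy global allocation: greedily keep the globally largest normalized singular
    values until `keep_ratio` of the original parameter count has been spent."""
    candidates = []
    for i, W in enumerate(weights):
        s = np.linalg.svd(W, compute_uv=False)
        cost = sum(W.shape)                # one extra rank costs m + n parameters
        for sv in s / s[0]:                # normalize so layers are comparable
            candidates.append((sv, i, cost))
    candidates.sort(reverse=True)

    budget = keep_ratio * sum(W.size for W in weights)
    ranks, spent = [0] * len(weights), 0
    for sv, i, cost in candidates:
        if spent + cost > budget:
            break
        ranks[i] += 1
        spent += cost
    return ranks

ranks = allocate_ranks([np.random.randn(256, 512), np.random.randn(512, 512)], keep_ratio=0.25)
print(ranks)   # different ranks per layer depending on their spectra
```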

Decomposable-Net: Scalable Low-Rank Compression for Neural Networks

ygcats/Scalable-Low-Rank-Compression-for-Neural-Networks 29 Oct 2019

Compressing DNNs is important for real-world applications operating on resource-constrained devices.

A flexible, extensible software framework for model compression based on the LC algorithm

UCMerced-ML/LC-model-compression 15 May 2020

We propose a software framework, based on the ideas of the Learning-Compression (LC) algorithm, that allows a user to compress a neural network or other machine learning model using different compression schemes with minimal effort.
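The LC algorithm these tools build on alternates a learning step (train the weights with a quadratic penalty pulling them toward their compressed form) with a compression step (project the current weights onto the chosen compression set). Below is a hedged sketch of that alternation for a single linear layer with a low-rank scheme; it is not the library's API, and `model.fc`, the hyperparameters, and the fixed penalty weight `mu` are illustrative assumptions (in the actual algorithm `mu` follows an increasing schedule over the outer iterations).

```python
import torch
import torch.nn as nn

def c_step_low_rank(W, rank):
    """C step for a low-rank scheme: project W onto rank-`rank` matrices via truncated SVD."""
    U, S, Vh = torch.linalg.svd(W, full_matrices=False)
    return (U[:, :rank] * S[:rank]) @ Vh[:rank, :]

def lc_train(model, loss_fn, loader, rank, mu=1e-3, outer_steps=10, lr=1e-3):
    """Hedged LC-style alternation for one linear layer exposed as `model.fc` (an assumption)."""
    theta = c_step_low_rank(model.fc.weight.detach(), rank)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(outer_steps):
        for x, y in loader:                              # L step: loss + (mu/2)||W - theta||^2
            opt.zero_grad()
            penalty = 0.5 * mu * (model.fc.weight - theta).pow(2).sum()
            (loss_fn(model(x), y) + penalty).backward()
            opt.step()
        theta = c_step_low_rank(model.fc.weight.detach(), rank)   # C step
    with torch.no_grad():
        model.fc.weight.copy_(theta)                     # final weights satisfy the rank constraint
    return model
```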

Low-Rank Compression of Neural Nets: Learning the Rank of Each Layer

UCMerced-ML/LC-model-compression CVPR 2020

Neural net compression can be achieved by approximating each layer's weight matrix by a low-rank matrix.
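As a concrete PyTorch sketch of that recipe (the generic factorization only, not the paper's method for learning each layer's rank), an `nn.Linear` can be swapped for two smaller linear layers whose product equals the truncated SVD of the original weight:

```python
import torch
import torch.nn as nn

def factorize_linear(layer: nn.Linear, rank: int) -> nn.Sequential:
    """Replace an nn.Linear with two linear layers (in -> rank -> out) whose product
    is the best rank-`rank` approximation of the original weight matrix."""
    W = layer.weight.detach()                            # (out_features, in_features)
    U, S, Vh = torch.linalg.svd(W, full_matrices=False)
    first = nn.Linear(layer.in_features, rank, bias=False)
    second = nn.Linear(rank, layer.out_features, bias=layer.bias is not None)
    with torch.no_grad():
        first.weight.copy_(Vh[:rank, :])                 # (rank, in_features)
        second.weight.copy_(U[:, :rank] * S[:rank])      # (out_features, rank)
        if layer.bias is not None:
            second.bias.copy_(layer.bias.detach())
    return nn.Sequential(first, second)

layer = nn.Linear(1024, 512)
compact = factorize_linear(layer, rank=64)               # 64*(1024+512) weights vs. 1024*512
x = torch.randn(8, 1024)
print((layer(x) - compact(x)).abs().max())               # error comes only from the truncation
```

Choosing the rank of each layer is exactly the question this paper addresses; the sketch only shows the mechanical replacement.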

Model compression as constrained optimization, with application to neural nets. Part V: combining compressions

UCMerced-ML/LC-model-compression 9 Jul 2021

VGG nets can be better compressed by combining low-rank compression with a few floating-point weights.
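A hedged toy sketch of such a combination (not the paper's constrained-optimization formulation): approximate the weight matrix by a low-rank term plus a sparse correction that keeps only the largest-magnitude residual entries as full floating-point weights. Shapes, rank, and the number of kept entries are illustrative.

```python
import numpy as np

def low_rank_plus_sparse(W, rank, n_keep):
    """Approximate W by a rank-`rank` matrix plus a sparse correction holding the
    `n_keep` largest-magnitude residual entries as floating-point weights."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    L = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
    R = W - L
    thresh = np.sort(np.abs(R), axis=None)[-n_keep]      # n_keep-th largest residual magnitude
    S = np.where(np.abs(R) >= thresh, R, 0.0)
    return L, S

W = np.random.randn(512, 512)
L, S = low_rank_plus_sparse(W, rank=32, n_keep=2000)
err = np.linalg.norm(W - (L + S)) / np.linalg.norm(W)
print(f"relative error with rank 32 plus 2000 sparse corrections: {err:.3f}")
```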

Compact Model Training by Low-Rank Projection with Energy Transfer

bzqlin/lrpet 12 Apr 2022

In this paper, we devise a new training method, low-rank projection with energy transfer (LRPET), that trains low-rank compressed networks from scratch and achieves competitive performance.
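A hedged reading of the mechanism behind the method name (only a sketch, not LRPET's exact update): after each optimizer step, project the weight matrix onto a low-rank set and rescale it so the energy (Frobenius norm) removed by the truncation is transferred back to the retained components. The rank, layer selection, and training loop below are illustrative assumptions.

```python
import torch

@torch.no_grad()
def project_with_energy_compensation(W, rank):
    """Project W onto rank-`rank` matrices and rescale so its Frobenius norm is preserved."""
    U, S, Vh = torch.linalg.svd(W, full_matrices=False)
    W_lr = (U[:, :rank] * S[:rank]) @ Vh[:rank, :]
    scale = torch.linalg.norm(W) / torch.linalg.norm(W_lr).clamp_min(1e-12)
    W.copy_(scale * W_lr)

# Illustrative placement inside a training-from-scratch loop:
# for x, y in loader:
#     loss = criterion(model(x), y)
#     loss.backward(); optimizer.step(); optimizer.zero_grad()
#     for m in model.modules():
#         if isinstance(m, torch.nn.Linear):
#             project_with_energy_compensation(m.weight, rank=32)
```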

TT-NF: Tensor Train Neural Fields

toshas/ttnf 30 Sep 2022

Learning neural fields has been an active topic in deep learning research, focusing, among other issues, on finding more compact and easy-to-fit representations.
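For reference, the tensor-train format used here factors a d-dimensional grid of values into a chain of small 3-way cores. A minimal sketch of the standard TT-SVD decomposition (generic, not the paper's neural-field parameterization; the tensor shape and maximum rank are illustrative):

```python
import numpy as np

def tt_svd(T, max_rank):
    """Factor a d-way tensor into tensor-train cores G_k of shape (r_{k-1}, n_k, r_k)
    by sequential truncated SVDs of the unfoldings."""
    dims = T.shape
    cores, r_prev, M = [], 1, T.reshape(dims[0], -1)
    for n_k in dims[:-1]:
        M = M.reshape(r_prev * n_k, -1)
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(U[:, :r].reshape(r_prev, n_k, r))
        M = s[:r, None] * Vt[:r, :]
        r_prev = r
    cores.append(M.reshape(r_prev, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the cores back into a full tensor (to check the approximation)."""
    T = cores[0]
    for G in cores[1:]:
        T = np.tensordot(T, G, axes=([-1], [0]))
    return T[0, ..., 0]

T = np.random.randn(8, 8, 8, 8)
cores = tt_svd(T, max_rank=4)
print(np.linalg.norm(T - tt_reconstruct(cores)) / np.linalg.norm(T))
```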