Search Results for author: Rui Heng Yang

Found 3 papers, 1 paper with code

DenseShift: Towards Accurate and Efficient Low-Bit Power-of-Two Quantization

1 code implementation · ICCV 2023 · Xinlin Li, Bang Liu, Rui Heng Yang, Vanessa Courville, Chao Xing, Vahid Partovi Nia

We further propose a sign-scale decomposition design to enhance training efficiency and a low-variance random initialization strategy to improve the model's transfer learning performance.

Quantization · Transfer Learning
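As a rough illustration of the low-bit power-of-two quantization the paper title refers to, the sketch below rounds each weight to a signed power of two, storing sign and exponent separately in the spirit of a sign-scale decomposition. This is a hypothetical minimal example, not the paper's DenseShift algorithm; the function name, bit allocation, and exponent range are all assumptions.

```python
import numpy as np

def pot_quantize(w, bits=3):
    """Round weights to signed powers of two: w ~ sign(w) * 2**e.

    Hypothetical sketch (not the DenseShift method): one bit holds the
    sign, the remaining bits index a small set of negative exponents, so
    multiplication by a quantized weight reduces to a bit shift.
    """
    sign = np.sign(w)
    levels = 2 ** (bits - 1)               # exponents representable after the sign bit
    e = np.round(np.log2(np.abs(w) + 1e-12))
    e = np.clip(e, -levels, -1)            # e.g. exponents in {-4, ..., -1} for bits=3
    return sign * (2.0 ** e)

# Every quantized magnitude is an exact power of two.
q = pot_quantize(np.array([0.3, -0.07, 0.55, -0.9]))
```

Because each magnitude is a power of two, the multiply in a dot product can be replaced by an integer shift of the exponent, which is the efficiency argument behind shift/power-of-two networks.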

Tensor train decompositions on recurrent networks

no code implementations · 9 Jun 2020 · Alejandro Murua, Ramchalam Ramakrishnan, Xinlin Li, Rui Heng Yang, Vahid Partovi Nia

Recurrent neural networks (RNN) such as long short-term memory (LSTM) networks are essential in a multitude of daily life tasks such as speech, language, video, and multimodal learning.
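For context on the tensor-train (TT) decompositions the title names, the sketch below implements the standard TT-SVD procedure (sequential truncated SVDs) that is commonly used to compress large weight tensors such as RNN weight matrices. It is an illustrative sketch of the generic algorithm, not the authors' code; the function names and the single shared `max_rank` are assumptions.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-way tensor into TT cores via sequential truncated SVDs.

    Generic TT-SVD sketch: at each step the remainder is matricized,
    factored by SVD, truncated to at most `max_rank`, and the left factor
    becomes the next core of shape (rank_in, mode_size, rank_out).
    """
    shape = tensor.shape
    cores, rank = [], 1
    mat = tensor.reshape(rank * shape[0], -1)
    for k in range(len(shape) - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(u[:, :r].reshape(rank, shape[k], r))
        mat = (np.diag(s[:r]) @ vt[:r]).reshape(r * shape[k + 1], -1)
        rank = r
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into the full tensor (for checking)."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])
```

With a large enough `max_rank` the reconstruction is exact; compression comes from truncating the ranks, which shrinks the parameter count from the product of all mode sizes to a sum of small per-core products.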
