Search Results for author: Yitzchak Shmalo

Found 3 papers, 1 paper with code

Enhancing Accuracy in Deep Learning Using Random Matrix Theory

no code implementations · 4 Oct 2023 · Leonid Berlyand, Etienne Sandier, Yitzchak Shmalo, Lei Zhang

We explore the applications of random matrix theory (RMT) in the training of deep neural networks (DNNs), focusing on layer pruning, that is, reducing the number of DNN parameters (weights).

Deep Learning Weight Pruning with RMT-SVD: Increasing Accuracy and Reducing Overfitting

1 code implementation · 15 Mar 2023 · Yitzchak Shmalo, Jonathan Jenkins, Oleksii Krupchytskyi

Recently, random matrix theory (RMT) has been applied to the overfitting problem in deep learning.

Stability of Accuracy for the Training of DNNs Via the Uniform Doubling Condition

no code implementations · 16 Oct 2022 · Yitzchak Shmalo

A recent result by Berlyand, Jabin, and Safsten introduces a doubling condition on the training data, which ensures the stability of accuracy during training for DNNs using the absolute value activation function.
