Search Results for author: Mahdi Zolnouri

Found 5 papers, 1 paper with code

Efficient Training Under Limited Resources

1 code implementation • 23 Jan 2023 • Mahdi Zolnouri, Dounia Lakhmiri, Christophe Tribes, Eyyüb Sari, Sébastien Le Digabel

The training time budget and the size of the dataset are among the factors that affect the performance of a Deep Neural Network (DNN).

Data Augmentation • Neural Architecture Search

Rethinking Pareto Frontier for Performance Evaluation of Deep Neural Networks

no code implementations • 18 Feb 2022 • Vahid Partovi Nia, Alireza Ghaffari, Mahdi Zolnouri, Yvon Savaria

We propose to use a multi-dimensional Pareto frontier to re-define the efficiency measure of candidate deep learning models, where several variables such as training cost, inference latency, and accuracy play a relative role in defining a dominant model.

Benchmarking • Image Classification
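The multi-dimensional Pareto frontier described above can be sketched in a few lines: a model is kept on the frontier only if no other model is at least as good on every metric and strictly better on one. This is a minimal illustration, not the paper's code; the model names and metric values are made up.

```python
# Each model is scored as (training_cost, inference_latency, error),
# where lower is better on all three axes.
def dominates(a, b):
    """a dominates b if a is no worse on every metric and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_frontier(models):
    """Keep only the models that no other model dominates."""
    return {
        name: metrics
        for name, metrics in models.items()
        if not any(dominates(other, metrics)
                   for other_name, other in models.items() if other_name != name)
    }

models = {
    "model_a": (10.0, 5.0, 0.10),  # cheap and fast, but less accurate
    "model_b": (50.0, 8.0, 0.05),  # costlier and slower, but more accurate
    "model_c": (60.0, 9.0, 0.06),  # worse than model_b on every metric
}
print(sorted(pareto_frontier(models)))  # → ['model_a', 'model_b']
```

Note that neither `model_a` nor `model_b` dominates the other: each wins on different metrics, which is exactly why a single-number efficiency score is replaced by a frontier.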

Demystifying and Generalizing BinaryConnect

no code implementations • NeurIPS 2021 • Tim Dockhorn, YaoLiang Yu, Eyyüb Sari, Mahdi Zolnouri, Vahid Partovi Nia

BinaryConnect (BC) and its many variations have become the de facto standard for neural network quantization.

Quantization
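The core BinaryConnect idea referenced above is simple to sketch: keep real-valued "latent" weights, binarize them with the sign function for the forward pass, and apply the gradient to the latent weights (the straight-through estimator). The snippet below is an illustrative toy, not the paper's method or code; the loss and data are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
w_latent = rng.normal(size=(4,))   # real-valued weights kept by the optimizer
x = np.array([1.0, -2.0, 0.5, 3.0])
lr = 0.1

for _ in range(3):
    w_bin = np.sign(w_latent)      # binarize to {-1, +1} for the forward pass
    y = w_bin @ x                  # forward computation uses only binary weights
    grad_y = 2.0 * (y - 1.0)       # gradient of a toy squared loss (y - 1)^2
    grad_w = grad_y * x            # straight-through: gradient flows to latent weights
    w_latent -= lr * grad_w        # update the real-valued weights, not the binary ones
```

At deployment time only `w_bin` is needed, which is what makes BC-style quantization attractive for storage and compute.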

Importance of Data Loading Pipeline in Training Deep Neural Networks

no code implementations • 21 Apr 2020 • Mahdi Zolnouri, Xinlin Li, Vahid Partovi Nia

Training large-scale deep neural networks is a long, time-consuming operation that often requires many GPUs to accelerate.

Data Augmentation
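The point of a well-designed data loading pipeline, as studied in the paper above, can be illustrated with a minimal producer-consumer sketch: a background thread prefetches batches into a bounded queue so the training step never stalls waiting on I/O. This is a hypothetical stdlib illustration, not the paper's implementation; the batch contents and delays are simulated.

```python
import queue
import threading
import time

def load_batch(i):
    time.sleep(0.01)                # simulated disk I/O + augmentation cost
    return [i] * 4                  # a fake batch of samples

def prefetcher(num_batches, out_q):
    for i in range(num_batches):
        out_q.put(load_batch(i))    # blocks when the queue is full (backpressure)
    out_q.put(None)                 # sentinel: no more batches

q = queue.Queue(maxsize=2)          # small buffer of ready-to-consume batches
threading.Thread(target=prefetcher, args=(8, q), daemon=True).start()

seen = []
while (batch := q.get()) is not None:
    seen.append(batch[0])           # the "training step" consumes a ready batch
```

Real frameworks generalize this pattern with multiple worker processes (e.g. the `num_workers` option of PyTorch's `DataLoader`), but the queue-based overlap of loading and compute is the same idea.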
