no code implementations • 18 Mar 2024 • Ali Asghar Sharifi, Ali Zoljodi, Masoud Daneshtalab
Through empirical studies, TrajectoryNAS demonstrates its effectiveness in enhancing the performance of autonomous driving systems, marking a significant advancement in the field. Experimental results reveal that TrajectoryNAS yields a minimum of 4.8% higher accuracy and 1.1× lower latency than competing methods on the NuScenes dataset.
no code implementations • 5 Mar 2024 • Mahdi Taheri, Natalia Cherezova, Samira Nazari, Ahsan Rafiq, Ali Azarpeyvand, Tara Ghasempouri, Masoud Daneshtalab, Jaan Raik, Maksim Jenihhin
In this paper, we propose an architecture of a novel adaptive fault-tolerant approximate multiplier tailored for ASIC-based DNN accelerators.
no code implementations • 5 Mar 2024 • Mahdi Taheri, Masoud Daneshtalab, Jaan Raik, Maksim Jenihhin, Salvatore Pappalardo, Paul Jimenez, Bastien Deveautour, Alberto Bosio
Systolic array has emerged as a prominent architecture for Deep Neural Network (DNN) hardware accelerators, providing high-throughput and low-latency performance essential for deploying DNNs across diverse applications.
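The systolic dataflow behind such accelerators can be illustrated with a small simulation. This is a sketch of a generic output-stationary systolic matrix multiply, not the architecture proposed in the paper; the function name and cycle model are our own assumptions:

```python
import numpy as np

def systolic_matmul(A, B):
    """Cycle-level sketch of an output-stationary systolic array.

    Each PE (i, j) accumulates A[i, k] * B[k, j]; operands are skewed so
    that row i of A and column j of B reach PE (i, j) with the proper
    delay, which is the essence of the systolic dataflow.
    """
    n, k = A.shape
    k2, m = B.shape
    assert k == k2
    acc = np.zeros((n, m))
    # Enough cycles for the last skewed operands to reach PE (n-1, m-1).
    for t in range(n + m + k - 2):
        for i in range(n):
            for j in range(m):
                s = t - i - j  # k-index arriving at PE (i, j) in cycle t
                if 0 <= s < k:
                    acc[i, j] += A[i, s] * B[s, j]
    return acc

A = np.arange(6).reshape(2, 3).astype(float)
B = np.arange(12).reshape(3, 4).astype(float)
result = systolic_matmul(A, B)
```

With the skewing above, every partial product is accumulated exactly once, so the result matches a plain matrix multiply.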
no code implementations • 17 Jan 2024 • Mahdi Taheri, Natalia Cherezova, Mohammad Saeed Ansari, Maksim Jenihhin, Ali Mahani, Masoud Daneshtalab, Jaan Raik
Stringent reliability requirements for Deep Neural Network (DNN) accelerators stand alongside the need to reduce the computational burden on hardware platforms, i.e., to lower energy consumption and execution time while increasing the efficiency of DNN accelerators.
1 code implementation • 16 Aug 2023 • Ali Zoljodi, Sadegh Abadijou, Mina Alibeigi, Masoud Daneshtalab
CLLD is a novel multi-task contrastive learning method that trains lane detection models to detect lane markings even in low-visibility conditions by integrating local feature contrastive learning (CL) with our newly proposed cross-similarity operation.
Ranked #4 on Lane Detection on TuSimple
no code implementations • ADBIS 2023 • Seyed Ali Mousavi, Hamid Mousavi, Masoud Daneshtalab
Finally, we proposed a fair adversarial retraining method (FARMUR) to mitigate unfairness in robustness that retrains the DNN models based on vulnerable and robust sub-partitions.
no code implementations • 16 Jun 2023 • Mohammad Hasan Ahmadilivani, Mahdi Taheri, Jaan Raik, Masoud Daneshtalab, Maksim Jenihhin
Thereafter, a novel method for splitting the critical neurons is proposed that enables the design of a Lightweight Correction Unit (LCU) in the accelerator without redesigning its computational part.
no code implementations • 31 May 2023 • Mohammad Hasan Ahmadilivani, Mario Barbareschi, Salvatore Barone, Alberto Bosio, Masoud Daneshtalab, Salvatore Della Torca, Gabriele Gavarini, Maksim Jenihhin, Jaan Raik, Annachiara Ruospo, Ernesto Sanchez, Mahdi Taheri
We propose to use approximate (AxC) arithmetic circuits to agilely emulate errors in hardware without performing fault injection on the DNN.
no code implementations • 31 May 2023 • Mahdi Taheri, Mohammad Hasan Ahmadilivani, Maksim Jenihhin, Masoud Daneshtalab, Jaan Raik
Nowadays, the extensive exploitation of Deep Neural Networks (DNNs) in safety-critical applications raises new reliability concerns.
no code implementations • 9 May 2023 • Mohammad Hasan Ahmadilivani, Mahdi Taheri, Jaan Raik, Masoud Daneshtalab, Maksim Jenihhin
Through this SLR, three kinds of methods for reliability assessment of DNNs are identified including Fault Injection (FI), Analytical, and Hybrid methods.
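Of these three families, Fault Injection is typically realized by flipping bits in stored parameters and observing the effect on accuracy. The following is a minimal illustrative sketch of statistical bit-flip injection into float32 weights; the helper names and fault model are our own assumptions, not a specific framework from the survey:

```python
import struct
import numpy as np

rng = np.random.default_rng(0)

def flip_bit(value, bit):
    """Flip one bit of a float32 value (single-bit-upset fault model)."""
    (as_int,) = struct.unpack("<I", struct.pack("<f", np.float32(value)))
    (faulty,) = struct.unpack("<f", struct.pack("<I", as_int ^ (1 << bit)))
    return faulty

def inject_faults(weights, n_faults):
    """Statistical fault injection: flip random bits of random weights."""
    faulty = weights.copy().ravel()
    for _ in range(n_faults):
        idx = rng.integers(faulty.size)   # which weight is hit
        bit = rng.integers(32)            # which bit of its float32 encoding
        faulty[idx] = flip_bit(faulty[idx], bit)
    return faulty.reshape(weights.shape)

w = rng.standard_normal((4, 4)).astype(np.float32)
w_faulty = inject_faults(w, n_faults=3)
```

Running inference with `w_faulty` in place of `w` and comparing accuracies is the basic FI loop; analytical and hybrid methods aim to avoid repeating this costly loop for every fault site.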
no code implementations • 14 Mar 2023 • Mahdi Taheri, Mohammad Riazati, Mohammad Hasan Ahmadilivani, Maksim Jenihhin, Masoud Daneshtalab, Jaan Raik, Mikael Sjodin, Bjorn Lisper
The framework enables selective approximation of reliability-critical DNNs, providing a set of Pareto-optimal DNN implementation design space points for the target resource utilization requirements.
no code implementations • 13 Mar 2023 • Mohammad Hasan Ahmadilivani, Mahdi Taheri, Jaan Raik, Masoud Daneshtalab, Maksim Jenihhin
In this work, we propose a novel accurate, fine-grain, metric-oriented, and accelerator-agnostic method called DeepVigor that provides vulnerability value ranges for DNN neurons' outputs.
no code implementations • 17 Jan 2023 • Mehdi Asadi, Fatemeh Poursalim, Mohammad Loni, Masoud Daneshtalab, Mikael Sjödin, Arash Gharehbaghi
This paper presents a novel machine learning framework for detecting Paroxysmal Atrial Fibrillation (PxAF), a pathological characteristic of Electrocardiogram (ECG) that can lead to fatal conditions such as heart attack.
1 code implementation • 8 Dec 2022 • Hamidreza Mahini, Hamid Mousavi, Masoud Daneshtalab
GTFLAT, as a game theory-based add-on, addresses an important research question: How can a federated learning algorithm achieve better performance and training efficiency by setting more effective adaptive weights for averaging in the model aggregation phase?
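The aggregation step in question can be sketched as weighted parameter averaging. Plain FedAvg weights clients by dataset size; GTFLAT instead derives the weights from a game-theoretic procedure. This sketch (our own function name) only shows where such adaptive weights plug in, not how GTFLAT computes them:

```python
import numpy as np

def weighted_fedavg(client_models, weights):
    """Aggregate client parameter vectors with per-client weights."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()   # normalize to a convex combination
    stacked = np.stack(client_models)   # shape: (n_clients, n_params)
    return weights @ stacked            # weighted average of parameters

clients = [np.array([1.0, 2.0]), np.array([3.0, 4.0])]
global_model = weighted_fedavg(clients, weights=[1, 3])
```

With uniform weights this reduces to standard averaging; an add-on like GTFLAT changes only the `weights` argument each round.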
1 code implementation • 14 Jul 2022 • Hamid Mousavi, Mohammad Loni, Mina Alibeigi, Masoud Daneshtalab
In this paper, we propose a new method to search for sparsity-friendly neural architectures.
1 code implementation • Design, Automation and Test in Europe Conference (DATE) 2022 • Mohammad Loni, Hamid Mousavi, Mohammad Riazati, Masoud Daneshtalab, Mikael Sjödin
This paper proposes TAS, a framework that drastically reduces the accuracy gap between TNNs and their full-precision counterparts by integrating quantization into the network design.
no code implementations • 19 Apr 2020 • Seyed Ahmad Mirsalari, Sima Sinaei, Mostafa E. Salehi, Masoud Daneshtalab
Recurrent Neural Networks (RNNs) are widely used for learning sequences in applications such as EEG classification.
no code implementations • 19 Apr 2020 • Najmeh Nazari, Seyed Ahmad Mirsalari, Sima Sinaei, Mostafa E. Salehi, Masoud Daneshtalab
Long Short-Term Memory (LSTM) is widely used in various sequential applications.
no code implementations • 5 Feb 2016 • Sergei Dytckov, Masoud Daneshtalab
If spike-driven applications that minimize the number of spikes are developed, spiking neural systems may reach the energy-efficiency level of classical neural systems.