Search Results for author: Samir Khaki

Found 8 papers, 7 papers with code

ATOM: Attention Mixer for Efficient Dataset Distillation

1 code implementation • 2 May 2024 • Samir Khaki, Ahmad Sajedi, Kai Wang, Lucy Z. Liu, Yuri A. Lawryshyn, Konstantinos N. Plataniotis

To address these challenges in dataset distillation, we propose the ATtentiOn Mixer (ATOM) module to efficiently distill large datasets using a mixture of channel-wise and spatial-wise attention in the feature-matching process.

Neural Architecture Search
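As a rough illustration of the idea behind attention-based feature matching (a sketch, not the authors' implementation), the following NumPy code derives channel-wise and spatial-wise attention maps from feature activations and compares them between a real and a synthetic batch. The exponent `p`, the normalization, and the mixing weights are illustrative assumptions.

```python
import numpy as np

def attention_maps(feats, p=2):
    # feats: (N, C, H, W) feature activations from some network layer.
    a = np.abs(feats) ** p
    spatial = a.sum(axis=1)       # (N, H, W): sum over channels
    channel = a.sum(axis=(2, 3))  # (N, C): sum over spatial positions
    return spatial, channel

def normalize(x):
    # L2-normalize each sample's attention map (flattened).
    flat = x.reshape(x.shape[0], -1)
    return flat / (np.linalg.norm(flat, axis=1, keepdims=True) + 1e-8)

def atom_style_loss(real_feats, syn_feats, w_spatial=0.5, w_channel=0.5):
    # Match the mean spatial and channel attention of the two batches;
    # the 0.5/0.5 mixing weights are placeholder assumptions.
    rs, rc = attention_maps(real_feats)
    ss, sc = attention_maps(syn_feats)
    l_sp = np.sum((normalize(rs).mean(0) - normalize(ss).mean(0)) ** 2)
    l_ch = np.sum((normalize(rc).mean(0) - normalize(sc).mean(0)) ** 2)
    return w_spatial * l_sp + w_channel * l_ch
```

In a distillation loop this loss would be minimized with respect to the synthetic images that produce `syn_feats`.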

The Need for Speed: Pruning Transformers with One Recipe

1 code implementation • 26 Mar 2024 • Samir Khaki, Konstantinos N. Plataniotis

We introduce the One-shot Pruning Technique for Interchangeable Networks (OPTIN) framework as a tool to increase the efficiency of pre-trained transformer architectures without requiring re-training.

Image Classification • Semantic Segmentation • +1
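OPTIN's actual saliency measure is not reproduced here; as a generic stand-in for the one-shot, no-retraining setting the abstract describes, the sketch below prunes a model globally by weight magnitude in a single pass, assuming the weights are given as a dict of NumPy arrays.

```python
import numpy as np

def one_shot_prune(weights, sparsity):
    # weights: dict of layer name -> ndarray.
    # Zero out the globally smallest-magnitude weights in one shot,
    # with no re-training afterwards. Magnitude scoring is a simple
    # stand-in for a learned or trajectory-based importance metric.
    all_scores = np.concatenate([np.abs(w).ravel() for w in weights.values()])
    k = int(len(all_scores) * sparsity)
    threshold = np.partition(all_scores, k)[k] if k > 0 else -np.inf
    return {name: w * (np.abs(w) >= threshold) for name, w in weights.items()}
```

A single global threshold (rather than per-layer thresholds) lets heavily over-parameterized layers absorb more of the sparsity budget.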

DataDAM: Efficient Dataset Distillation with Attention Matching

2 code implementations • ICCV 2023 • Ahmad Sajedi, Samir Khaki, Ehsan Amjadian, Lucy Z. Liu, Yuri A. Lawryshyn, Konstantinos N. Plataniotis

Emerging research on dataset distillation aims to reduce training costs by creating a small synthetic set that captures the information of a larger real dataset, such that a model trained on it ultimately achieves test accuracy equivalent to one trained on the whole dataset.

Continual Learning • Neural Architecture Search
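To make the distillation objective above concrete, here is a toy distribution-matching sketch (deliberately simpler than DataDAM's attention-matching objective): a small synthetic set is updated by gradient descent until its mean, used as a stand-in for deeper network features, matches that of the real data.

```python
import numpy as np

def distill(real, n_syn=10, steps=200, lr=0.5, rng=None):
    # real: (N, d) real data. Learn n_syn synthetic samples whose mean
    # matches the real mean, minimizing ||mean(syn) - mean(real)||^2.
    rng = rng or np.random.default_rng(0)
    syn = rng.normal(size=(n_syn, real.shape[1]))
    target = real.mean(axis=0)
    for _ in range(steps):
        # Analytic gradient of the squared-error objective w.r.t. each
        # synthetic sample: 2 * (mean(syn) - target) / n_syn.
        grad = 2.0 * (syn.mean(axis=0) - target) / n_syn
        syn -= lr * grad
    return syn
```

Real methods match richer statistics (features or attention maps across layers and training trajectories), but the optimization pattern, gradient descent on the synthetic pixels themselves, is the same.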

CFDP: Common Frequency Domain Pruning

1 code implementation • 7 Jun 2023 • Samir Khaki, Weihan Luo

In this paper, we introduce a novel end-to-end pipeline for model pruning via the frequency domain.
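CFDP's exact criterion is not reproduced here; as a hedged illustration of pruning via the frequency domain in the generic sense, the sketch below scores each output channel of a convolution by the magnitude of its filters' 2-D DFT and keeps only the top-scoring channels.

```python
import numpy as np

def frequency_scores(conv_weight):
    # conv_weight: (out_channels, in_channels, k, k).
    # Score each output channel by the total magnitude of its filters'
    # 2-D frequency response (an assumed stand-in for CFDP's criterion).
    spec = np.fft.fft2(conv_weight, axes=(-2, -1))
    return np.abs(spec).sum(axis=(1, 2, 3))

def prune_channels(conv_weight, keep_ratio=0.5):
    # Keep the top-scoring output channels, preserving their order.
    scores = frequency_scores(conv_weight)
    n_keep = max(1, int(len(scores) * keep_ratio))
    keep = np.sort(np.argsort(scores)[-n_keep:])
    return conv_weight[keep], keep
```

Pruning in the frequency domain can separate filters that look similar spatially but respond to different frequency bands, which a pure magnitude criterion would conflate.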

CONetV2: Efficient Auto-Channel Size Optimization for CNNs

1 code implementation • 13 Oct 2021 • Yi Ru Wang, Samir Khaki, Weihang Zheng, Mahdi S. Hosseini, Konstantinos N. Plataniotis

Neural Architecture Search (NAS) has been pivotal in finding optimal network configurations for Convolutional Neural Networks (CNNs).

Knowledge Distillation • Neural Architecture Search
