no code implementations • 3 Feb 2024 • Le Chen, Nesreen K. Ahmed, Akash Dutta, Arijit Bhattacharjee, Sixing Yu, Quazi Ishtiaque Mahmud, Waqwoya Abebe, Hung Phan, Aishwarya Sarkar, Branden Butler, Niranjan Hasabnis, Gal Oren, Vy A. Vo, Juan Pablo Munoz, Theodore L. Willke, Tim Mattson, Ali Jannesari
Recently, language models (LMs), especially large language models (LLMs), have revolutionized the field of deep learning.
no code implementations • 30 Sep 2023 • Sixing Yu, J. Pablo Muñoz, Ali Jannesari
This is evident across tasks in both natural language processing and computer vision domains.
no code implementations • 19 May 2023 • Sixing Yu, J. Pablo Muñoz, Ali Jannesari
Foundation Models (FMs), such as LLaMA, BERT, GPT, ViT, and CLIP, have demonstrated remarkable success in a wide range of applications, driven by their ability to leverage vast amounts of data for pre-training.
no code implementations • 9 Nov 2022 • Sixing Yu, J. Pablo Muñoz, Ali Jannesari
To address these challenges, we propose Resource-aware Federated Learning (RaFL).
no code implementations • 16 Aug 2022 • Duy Phuong Nguyen, Sixing Yu, J. Pablo Muñoz, Ali Jannesari
This method allows efficient multi-model knowledge fusion and the deployment of resource-aware models while preserving model heterogeneity.
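The snippet mentions fusing knowledge from heterogeneous models. One common mechanism for this is averaging the teachers' temperature-softened output distributions into a single soft target, as in knowledge distillation. The sketch below is only illustrative of that general idea, with made-up logits; it is not the paper's specific fusion procedure.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened probability distribution over logits."""
    exps = [math.exp(l / temperature) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def fuse_teachers(teacher_logits, temperature=2.0):
    """Average the softened distributions of several teacher models.
    Teachers may have different architectures; only their output
    distributions are combined, so heterogeneity is preserved."""
    dists = [softmax(l, temperature) for l in teacher_logits]
    n_classes = len(dists[0])
    return [sum(d[i] for d in dists) / len(dists) for i in range(n_classes)]

# Two hypothetical teachers vote via their soft predictions.
target = fuse_teachers([[2.0, 0.5, 0.1], [1.5, 1.0, 0.2]])
print(target)  # a valid distribution (sums to 1) usable as a student target
```

A student model would then be trained against `target` (e.g., with a KL-divergence loss), which is how multiple models' knowledge can be fused without requiring them to share one architecture.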
1 code implementation • 29 Nov 2021 • Sixing Yu, Phuong Nguyen, Waqwoya Abebe, Wei Qian, Ali Anwar, Ali Jannesari
Federated learning (FL) facilitates training and deploying AI models on edge devices.
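For readers unfamiliar with FL: clients train on local data and a server aggregates their models without ever seeing the raw data. A minimal sketch of the standard FedAvg aggregation step follows; this is generic background, not the specific method proposed in the listed paper.

```python
def local_update(weights, gradient, lr=0.1):
    """One simulated SGD step on a client's private local data."""
    return [w - lr * g for w, g in zip(weights, gradient)]

def fed_avg(client_weights, client_sizes):
    """Server aggregates client models, weighted by local dataset size."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(cw[i] * n for cw, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# Two clients start from the same global model and train locally
# (gradients here are made-up placeholder values).
global_model = [0.0, 0.0]
c1 = local_update(global_model, [1.0, 2.0])
c2 = local_update(global_model, [3.0, 4.0])
new_global = fed_avg([c1, c2], client_sizes=[100, 300])
print(new_global)  # weighted toward the larger client's update
```

Only model weights cross the network, which is what makes FL attractive for privacy-sensitive edge deployments.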
no code implementations • 13 Jun 2021 • Sixing Yu, Phuong Nguyen, Ali Anwar, Ali Jannesari
Our approach reduces DNN inference FLOPs by up to 50% on edge devices while maintaining model quality.
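The FLOPs reduction claimed above typically comes from structured (channel) pruning. The toy calculation below shows the mechanism with a hypothetical conv layer; the 50% figure in the abstract is a reported empirical result, not something this sketch reproduces.

```python
def conv_flops(c_in, c_out, k, h_out, w_out):
    """Multiply-accumulate count for one convolutional layer."""
    return c_in * c_out * k * k * h_out * w_out

# A hypothetical 3x3 conv layer, before and after pruning half
# of its output channels.
dense  = conv_flops(c_in=64, c_out=128, k=3, h_out=56, w_out=56)
pruned = conv_flops(c_in=64, c_out=64,  k=3, h_out=56, w_out=56)
reduction = 1 - pruned / dense
print(f"FLOPs reduction: {reduction:.0%}")  # 50%
```

Because pruned channels also shrink `c_in` of the following layer, savings in a real network compound across consecutive layers.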
1 code implementation • 5 Feb 2021 • Sixing Yu, Arya Mazaheri, Ali Jannesari
Model compression is an essential technique for deploying deep neural networks (DNNs) on power and memory-constrained resources.
no code implementations • ICCV 2021 • Sixing Yu, Arya Mazaheri, Ali Jannesari
We compare our embedding-based approach with rule-based DNN model compression methods to demonstrate its effectiveness.