no code implementations • 22 Dec 2023 • Mohamed Badi, Chaouki Ben Issaid, Anis Elgabli, Mehdi Bennis
The growing number of wireless edge devices has magnified challenges concerning energy, bandwidth, latency, and data heterogeneity.
no code implementations • 29 Aug 2022 • Chaouki Ben Issaid, Anis Elgabli, Mehdi Bennis
In this paper, we propose to solve a regularized distributionally robust learning problem in the decentralized setting, taking into account the data distribution shift.
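A regularized distributionally robust objective of this kind can be sketched as follows; the KL regularizer and the notation are a common formulation and an assumption here, since the abstract does not state the paper's exact choice:

```latex
\min_{\theta} \; \max_{\lambda \in \Delta_K} \; \sum_{i=1}^{K} \lambda_i f_i(\theta) \;-\; \mu \, \mathrm{KL}\!\left(\lambda \,\middle\|\, \tfrac{1}{K}\mathbf{1}\right)
```

where \(f_i\) is worker \(i\)'s local loss, \(\lambda\) adversarially reweights workers over the simplex \(\Delta_K\) to capture distribution shift, and \(\mu > 0\) trades robustness against the average-loss objective recovered as \(\mu \to \infty\).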
1 code implementation • 17 Jun 2022 • Anis Elgabli, Chaouki Ben Issaid, Amrit S. Bedi, Ketan Rajawat, Mehdi Bennis, Vaneet Aggarwal
Newton-type methods are popular in federated learning due to their fast convergence.
no code implementations • 2 Jun 2021 • Mounssif Krouka, Anis Elgabli, Chaouki Ben Issaid, Mehdi Bennis
In this paper, we propose a technique to reduce the total energy bill at the edge device by utilizing model compression and time-varying model split between the edge and remote nodes.
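The split-point trade-off behind this idea can be illustrated with a toy energy model: the device pays compute energy for the layers it runs locally and communication energy for uploading the (compressed) activations at the split. All constants, layer sizes, and the cost model below are illustrative assumptions, not the paper's measurements or formulation.

```python
def device_energy(split, layer_flops, act_bits,
                  e_flop=1e-9, e_bit=5e-8, compression=0.25):
    """Estimate edge-device energy for a given split point.

    The device computes layers [0, split) locally, then uploads the
    compressed activations at the split to the remote node.
    """
    compute = e_flop * sum(layer_flops[:split])        # local computation cost
    comm = e_bit * act_bits[split] * compression       # upload cost after compression
    return compute + comm

# toy 4-layer network: per-layer FLOPs and activation sizes in bits
# (act_bits[k] is the size of the tensor crossing the link if we split before layer k)
layer_flops = [2e6, 4e6, 4e6, 1e6]
act_bits = [8e5, 4e5, 2e5, 1e5, 1e4]

# pick the split minimizing the device's energy bill
best = min(range(len(layer_flops) + 1),
           key=lambda s: device_energy(s, layer_flops, act_bits))
print(best)  # prints 1: run one layer locally, then upload
```

Early layers here shrink the activations faster than they add FLOPs, so a shallow split wins; with different constants the optimum shifts, which is why a time-varying split can track changing channel and load conditions.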
no code implementations • 2 Jun 2021 • Mounssif Krouka, Anis Elgabli, Chaouki Ben Issaid, Mehdi Bennis
Split-learning (SL) has recently gained popularity due to its inherent privacy-preserving capabilities and ability to enable collaborative inference for devices with limited computational power.
no code implementations • 31 May 2021 • Anis Elgabli, Chaouki Ben Issaid, Amrit S. Bedi, Mehdi Bennis, Vaneet Aggarwal
In this paper, we propose an energy-efficient federated meta-learning framework.
no code implementations • 12 Nov 2020 • Mounssif Krouka, Anis Elgabli, Mohammed S. Elbamby, Cristina Perfecto, Mehdi Bennis, Vaneet Aggarwal
Wirelessly streaming high-quality 360-degree videos remains a challenging problem.
no code implementations • 9 Nov 2020 • Tamara Alshammari, Sumudu Samarakoon, Anis Elgabli, Mehdi Bennis
This article deals with the problem of distributed machine learning, in which agents update their models based on their local datasets, and aggregate the updated models collaboratively and in a fully decentralized manner.
no code implementations • 14 Sep 2020 • Chaouki Ben Issaid, Anis Elgabli, Jihong Park, Mehdi Bennis, Mérouane Debbah
In this paper, we propose a communication-efficient decentralized machine learning framework that solves a consensus optimization problem defined over a network of interconnected workers.
no code implementations • 6 Aug 2020 • Jihong Park, Sumudu Samarakoon, Anis Elgabli, Joongheon Kim, Mehdi Bennis, Seong-Lyun Kim, Mérouane Debbah
Machine learning (ML) is a promising enabler for the fifth generation (5G) communication systems and beyond.
no code implementations • 3 Jul 2020 • Anis Elgabli, Jihong Park, Chaouki Ben Issaid, Mehdi Bennis
Wireless connectivity is instrumental in enabling scalable federated learning (FL), yet wireless channels bring challenges for model training, in which channel randomness perturbs each worker's model update while multiple workers' updates incur significant interference under limited bandwidth.
no code implementations • 22 Jan 2020 • Hamza Khan, Anis Elgabli, Sumudu Samarakoon, Mehdi Bennis, Choong Seon Hong
Vehicle-to-everything (V2X) communication is a growing area of communication with a variety of use cases.
no code implementations • 9 Nov 2019 • Anis Elgabli, Jihong Park, Sabbir Ahmed, Mehdi Bennis
This article proposes a communication-efficient decentralized deep learning algorithm, coined layer-wise federated group ADMM (L-FGADMM).
no code implementations • 23 Oct 2019 • Anis Elgabli, Jihong Park, Amrit S. Bedi, Chaouki Ben Issaid, Mehdi Bennis, Vaneet Aggarwal
In this article, we propose a communication-efficient decentralized machine learning (ML) algorithm, coined quantized group ADMM (Q-GADMM).
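The communication saving in quantized schemes of this kind comes from sending low-bit quantized model updates instead of full-precision vectors. Below is a minimal sketch of unbiased stochastic quantization, a standard building block in this literature; the bit-width and rounding scheme are illustrative assumptions, not Q-GADMM's exact quantizer.

```python
import numpy as np

def stochastic_quantize(delta, bits=4):
    """Quantize a model-update vector to `bits` bits per entry.

    Each entry is mapped onto 2**bits uniform levels spanning
    [delta.min(), delta.max()], with probabilistic rounding so the
    quantizer is unbiased in expectation.
    """
    levels = 2 ** bits - 1
    lo, hi = delta.min(), delta.max()
    if hi == lo:                          # constant vector: nothing to quantize
        return delta.copy()
    scaled = (delta - lo) / (hi - lo) * levels
    floor = np.floor(scaled)
    prob = scaled - floor                 # probability of rounding up
    q = floor + (np.random.rand(*delta.shape) < prob)
    return lo + q / levels * (hi - lo)

delta = np.random.default_rng(0).standard_normal(1000)
q = stochastic_quantize(delta, bits=4)
# unbiased rounding: the average quantization error is close to zero
print(abs((q - delta).mean()))
```

Only the quantized levels (a few bits each) plus the two range scalars need to be transmitted, which is where the bandwidth reduction comes from.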
no code implementations • 30 Aug 2019 • Anis Elgabli, Jihong Park, Amrit S. Bedi, Mehdi Bennis, Vaneet Aggarwal
When data is distributed across multiple servers, lowering the communication cost between the servers (or workers) while solving the distributed learning problem is important, and it is the focus of this paper.
no code implementations • 16 Aug 2019 • Jihong Park, Shiqiang Wang, Anis Elgabli, Seungeun Oh, Eunjeong Jeong, Han Cha, Hyesung Kim, Seong-Lyun Kim, Mehdi Bennis
Devices at the edge of wireless networks are the last mile data sources for machine learning (ML).
1 code implementation • 2 Mar 2019 • Anis Elgabli, Ali Elghariani, Vaneet Aggarwal, Mehdi Bennis, Mark R. Bell
We introduce an objective function that is a sum of strictly convex and separable functions based on decomposing the received vector into multiple vectors.
Information Theory
1 code implementation • 7 Jun 2018 • Anis Elgabli, Vaneet Aggarwal
For example, in an experiment conducted over 100 real cellular bandwidth traces from a public dataset spanning different bandwidth regimes, our proposed algorithm (FastScan) achieves the minimum re-buffering (stall) time and the maximum average playback rate in every single trace, compared to the original dash.js rate adaptation scheme and the Festive, BBA, RB, and FastMPC algorithms.
Networking and Internet Architecture • Multimedia
no code implementations • 30 Apr 2018 • Anis Elgabli, Vaneet Aggarwal, Shuai Hao, Feng Qian, Subhabrata Sen
The objective is to optimize a novel QoE metric that models a combination of the three objectives of minimizing the stall/skip duration of the video, maximizing the playback quality of every chunk, and minimizing the number of quality switches.
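A QoE metric combining those three objectives can be sketched as a weighted score: reward per-chunk playback quality, and penalize stall/skip duration and quality switches. The weights and the linear form below are illustrative assumptions; the paper defines its own specific combination of the same three terms.

```python
def qoe(qualities, stall_durations, w_stall=2.0, w_switch=0.5):
    """Toy QoE score for an adaptive-streaming session.

    qualities:       playback quality chosen per chunk (e.g. bitrate in Mbps).
    stall_durations: seconds of re-buffering/skipping charged to each chunk.
    Higher is better: quality is rewarded, stalls and switches are penalized.
    """
    quality_term = sum(qualities)
    stall_term = w_stall * sum(stall_durations)
    switch_term = w_switch * sum(
        abs(a - b) for a, b in zip(qualities, qualities[1:])
    )
    return quality_term - stall_term - switch_term

# 4 chunks at 1.0/2.0/2.0/1.5 Mbps, one 0.5 s stall before chunk 2
score = qoe([1.0, 2.0, 2.0, 1.5], [0.0, 0.5, 0.0, 0.0])
print(score)  # 6.5 - 1.0 - 0.75 = 4.75
```

Because all three terms are traded off in one scalar objective, a scheduler can directly optimize chunk quality decisions against predicted bandwidth rather than tuning each objective separately.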
Networking and Internet Architecture • Multimedia