Federated Learning for Distributed Energy-Efficient Resource Allocation

20 Apr 2022 · Zelin Ji, Zhijin Qin

In cellular networks, resource allocation is typically performed in a centralized manner, which imposes a heavy computation burden on the base station (BS) and incurs high transmission overhead. This paper investigates a distributed resource allocation scheme for cellular networks that maximizes the energy efficiency (EE) of the system in the uplink transmission, while guaranteeing the quality of service (QoS) for cellular users. In particular, to cope with the fast-varying channels of the wireless communication environment, we propose a robust federated reinforcement learning (FRL_suc) framework that enables local users to perform distributed resource allocation, in terms of transmit power and channel assignment, under the guidance of a local neural network trained at each user. Analysis and numerical results show that the proposed FRL_suc framework lowers the transmission overhead and offloads computation from the central server to the local users, while outperforming the conventional multi-agent reinforcement learning algorithm in terms of EE and being more robust to channel variations.
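The federated step at the heart of such a framework can be sketched as follows. This is a minimal illustration, not the paper's exact method: the aggregation rule (plain federated averaging of local policy weights), the network shapes, and the `local_update` step are all illustrative assumptions. The key point it shows is that only model weights travel to the server, not raw channel observations, which is what reduces transmission overhead and keeps computation at the users.

```python
import numpy as np

def local_update(weights, grads, lr=0.01):
    # Hypothetical local training step at a user device
    # (e.g. one policy-gradient update on locally observed channels).
    return [w - lr * g for w, g in zip(weights, grads)]

def federated_average(user_weights):
    # Server-side aggregation: element-wise mean of the users' local
    # models (FedAvg-style). Only weights are uploaded, not channel data.
    return [np.mean(np.stack(layer), axis=0) for layer in zip(*user_weights)]

# Toy example: 3 users sharing a tiny 2-layer policy network.
rng = np.random.default_rng(0)
global_model = [rng.standard_normal((4, 8)), rng.standard_normal((8, 2))]

user_models = []
for _ in range(3):
    grads = [rng.standard_normal(w.shape) for w in global_model]
    user_models.append(local_update(global_model, grads))

global_model = federated_average(user_models)
print([w.shape for w in global_model])  # [(4, 8), (8, 2)]
```

In a full system each user would run many local reinforcement-learning episodes between aggregation rounds, and the averaged global model would be broadcast back to the users to guide their next power and channel decisions.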
