A Holistic Power Optimization Approach for Microgrid Control Based on Deep Reinforcement Learning

1 Mar 2024  ·  Fulong Yao, Wanqing Zhao, Matthew Forshaw, Yang Song

The global energy landscape is undergoing a transformation towards decarbonization, sustainability, and cost-efficiency. In this transition, microgrid systems integrated with renewable energy sources (RES) and energy storage systems (ESS) have emerged as a crucial component. However, the operational control of such an integrated energy system is typically optimized without a holistic view of the environmental, infrastructural and economic considerations involved, and it must also contend with uncertainties on both the supply and demand sides. This paper presents a holistic data-driven power optimization approach based on deep reinforcement learning (DRL) for microgrid control, addressing the combined needs of decarbonization, sustainability and cost-efficiency. First, two data-driven control schemes, namely the prediction-based (PB) and prediction-free (PF) schemes, are devised to formulate the control problem as a Markov decision process (MDP). Second, a multivariate objective (reward) function is designed to account for the market profits, carbon emissions, peak load, and battery degradation of the microgrid system. Third, we develop a Double Dueling Deep Q Network (D3QN) architecture to optimize the power flows for real-time energy management and to determine the charging/discharging strategies of the ESS. Finally, extensive simulations are conducted to demonstrate the effectiveness and superiority of the proposed approach through a comparative analysis. The results and analysis also suggest the respective circumstances under which each of the two control schemes is preferable in practical implementations with uncertainties.
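
To make the D3QN component and the multivariate reward more concrete, the sketch below illustrates a dueling Q-network, a weighted reward combining the four objectives named in the abstract, and a double-DQN loss. This is a minimal illustration under assumed details (state dimensionality, discretized charge/discharge actions, reward weights, and helper names are all placeholders), not the paper's implementation.

```python
# Minimal sketch of a Dueling Double DQN (D3QN) agent for ESS dispatch.
# Assumptions (not from the paper): a discretized charge/discharge action
# space, a flat state vector, and illustrative reward weights w.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DuelingQNet(nn.Module):
    """Shared trunk with separate state-value and advantage streams."""

    def __init__(self, state_dim: int, n_actions: int, hidden: int = 128):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU(),
                                   nn.Linear(hidden, hidden), nn.ReLU())
        self.value = nn.Linear(hidden, 1)              # V(s)
        self.advantage = nn.Linear(hidden, n_actions)  # A(s, a)

    def forward(self, s: torch.Tensor) -> torch.Tensor:
        h = self.trunk(s)
        v, a = self.value(h), self.advantage(h)
        # Dueling aggregation: Q(s,a) = V(s) + A(s,a) - mean_a A(s,a)
        return v + a - a.mean(dim=1, keepdim=True)


def multivariate_reward(profit, emissions, peak_load, degradation,
                        w=(1.0, 0.5, 0.3, 0.2)):
    """Illustrative weighted reward: reward market profit, penalize
    carbon emissions, peak load and battery degradation."""
    return (w[0] * profit - w[1] * emissions
            - w[2] * peak_load - w[3] * degradation)


def double_dqn_loss(online: DuelingQNet, target: DuelingQNet,
                    batch, gamma: float = 0.99) -> torch.Tensor:
    """One training step's TD loss on a sampled transition batch."""
    s, a, r, s2, done = batch
    q = online(s).gather(1, a.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        # Double DQN: the online net selects the next action,
        # the target net evaluates it.
        a_star = online(s2).argmax(dim=1, keepdim=True)
        q_next = target(s2).gather(1, a_star).squeeze(1)
        y = r + gamma * (1.0 - done) * q_next
    return F.smooth_l1_loss(q, y)
```

In a full training loop one would add an experience replay buffer, an epsilon-greedy behaviour policy, and periodic copying of the online weights into the target network; those standard pieces are omitted here for brevity.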
