Conference paper Open Access

Reinforcement Learning for Delay-Constrained Energy-Aware Small Cells with Multi-Sleeping Control

Dini, Paolo; El Amine, Ali; Nuaymi, Loutfi

In 5G networks, specific requirements are defined on the periodicity of Synchronization Signaling (SS) bursts, which imposes a constraint on the maximum period during which a Base Station (BS) can be deactivated. At the same time, BS densification is expected in the 5G architecture, causing a drastic increase in network energy consumption along with more complex interference management. In this paper, we study the Energy-Delay Tradeoff (EDT) problem in a Heterogeneous Network (HetNet) where small cells can switch to different sleep mode levels to save energy while maintaining a good Quality of Service (QoS). We propose a distributed Q-learning controller for small cells that adapts cell activity while taking into account the co-channel interference between cells. Our numerical results show that the multi-level sleep scheme outperforms the binary sleep scheme, with energy savings of up to 80% when users are delay tolerant, while respecting the periodicity of the SS bursts in 5G.

Grant number: 5G-REFINE - Resource EfFIcient 5G NEtworks (TEC2017-88373-R).

© 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
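The abstract describes a distributed Q-learning controller that selects among multiple sleep mode levels per small cell. The sketch below is a minimal, illustrative example of such a tabular Q-learning agent; the state space (discretized traffic load), the four sleep levels, the power and wake-up delay figures, and the reward weighting are all assumptions for illustration and are not taken from the paper.

```python
# Minimal sketch of a tabular Q-learning controller for small-cell sleep modes.
# All state/action/reward definitions here are illustrative assumptions, not
# the paper's exact formulation: the agent picks one of four sleep levels per
# decision epoch and is rewarded for saving energy while keeping delay low.
import random
import numpy as np

N_LOAD_BUCKETS = 5                   # discretized traffic-load states (assumption)
ACTIONS = [0, 1, 2, 3]               # 0 = active, 1-3 = deeper sleep modes (assumption)
POWER = [1.0, 0.5, 0.25, 0.1]        # relative power draw per mode (assumption)
WAKEUP_DELAY = [0.0, 0.1, 0.5, 1.0]  # relative reactivation delay (assumption)

alpha, gamma, epsilon = 0.1, 0.9, 0.1
Q = np.zeros((N_LOAD_BUCKETS, len(ACTIONS)))

def reward(load_bucket, action, delay_weight=0.5):
    """Trade off energy saved against the delay penalty caused by sleeping."""
    energy_saving = 1.0 - POWER[action]
    delay_penalty = WAKEUP_DELAY[action] * load_bucket / (N_LOAD_BUCKETS - 1)
    return energy_saving - delay_weight * delay_penalty

def step(load_bucket):
    """One Q-learning decision and update for the current traffic-load state."""
    # epsilon-greedy selection over the sleep levels
    if random.random() < epsilon:
        action = random.choice(ACTIONS)
    else:
        action = int(np.argmax(Q[load_bucket]))
    r = reward(load_bucket, action)
    next_bucket = random.randrange(N_LOAD_BUCKETS)  # stand-in traffic model
    # standard Q-learning temporal-difference update
    Q[load_bucket, action] += alpha * (
        r + gamma * np.max(Q[next_bucket]) - Q[load_bucket, action]
    )
    return next_bucket

bucket = 0
for _ in range(10_000):
    bucket = step(bucket)
print(Q.round(2))
```

In such a setup, a higher delay weight in the reward would push the learned policy toward shallower sleep modes under load, which mirrors the energy-delay tradeoff studied in the paper; the actual state, action, and reward design used by the authors is given in the full text.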
Files: Reinforcement Learning for Delay.pdf (309.0 kB, md5:d164844a65cede5b82f3e27cfd64e11c)
