
Energy-Efficient Ultra-Dense Network using Deep Reinforcement Learning

Hyungyu Ju, Seungnyun Kim, YoungJoon Kim, Hyojin Lee, Byonghyo Shim

Length: 17:23
27 May 2020

With the explosive growth in mobile data traffic, pursuing energy efficiency has become one of the key challenges for next-generation communication systems.
In recent years, an approach that reduces the energy consumption of base stations (BSs) by selectively turning them off, referred to as the sleep mode technique, has been suggested. However, due to the macro-cell-oriented network operation and the computational overhead, this approach has seen limited success.
In this paper, we propose an approach to determine the BS active/sleep modes of an ultra-dense network (UDN) using deep reinforcement learning (DRL).
A key ingredient of the proposed scheme is the action elimination network, which reduces the large action space of active/sleep mode selection.
Numerical results show that the proposed scheme significantly reduces the energy consumption of the UDN while satisfying the QoS requirement of the network.
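
To make the idea concrete, below is a minimal, hypothetical sketch of how a DRL agent with an action elimination network could pick per-BS active/sleep modes. It is not the authors' implementation: the number of BSs, the state features, the network sizes, and the violation threshold are illustrative assumptions, and a DQN-style greedy selection in PyTorch stands in for the scheme described in the paper.

```python
# Hypothetical sketch (not the paper's code): a Q-network scores every
# active/sleep combination, while an action elimination network masks out
# combinations predicted to violate the QoS constraint, shrinking the
# 2^N action space before the greedy selection.

import torch
import torch.nn as nn

N_BS = 8                    # number of small-cell BSs (assumed)
STATE_DIM = 3 * N_BS        # e.g. per-BS traffic load, channel quality, current mode (assumed)
N_ACTIONS = 2 ** N_BS       # every active/sleep combination

class QNetwork(nn.Module):
    """Estimates the long-term energy/QoS reward of each mode combination."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, N_ACTIONS),
        )

    def forward(self, state):
        return self.net(state)

class ActionEliminationNetwork(nn.Module):
    """Predicts, per action, the probability that the QoS requirement is violated."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 256), nn.ReLU(),
            nn.Linear(256, N_ACTIONS),
        )

    def forward(self, state):
        return torch.sigmoid(self.net(state))

def select_modes(q_net, elim_net, state, violation_threshold=0.5):
    """Greedy selection restricted to actions the elimination network keeps."""
    with torch.no_grad():
        q_values = q_net(state)
        p_violate = elim_net(state)
        # Eliminate actions whose predicted QoS-violation probability is too high.
        q_values = q_values.masked_fill(p_violate > violation_threshold, float("-inf"))
        action_idx = int(q_values.argmax())
    # Decode the action index into one active(1)/sleep(0) bit per BS.
    return [(action_idx >> b) & 1 for b in range(N_BS)]

if __name__ == "__main__":
    q_net, elim_net = QNetwork(), ActionEliminationNetwork()
    state = torch.randn(STATE_DIM)   # placeholder observation of the network state
    print(select_modes(q_net, elim_net, state))
```

The sketch illustrates why the elimination step matters: with N BSs the raw action space grows as 2^N, so pruning infeasible mode vectors before the argmax is what keeps the greedy selection tractable in a dense deployment.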
