TY - GEN
T1 - DRL driven energy-efficient resource allocation for multimedia broadband services in mobile edge network
AU - Huo, Yonghua
AU - Song, Chunxiao
AU - Ji, Xillin
AU - Yang, Mo
AU - Yu, Peng
AU - Tao, Minxing
AU - Shi, Lei
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/10/27
Y1 - 2020/10/27
N2 - Traffic-intensive Multimedia Broadband Services (MBS) lead to explosive mobile traffic growth in 5G networks, and the Mobile Edge Network (MEN) is a potential solution. The mobile edge computing network mainly provides users with ubiquitous computing support to meet the needs of delay-sensitive and computation-intensive services. Although mobile edge networks offer advantages such as low latency, moving storage and computing resources down to the edge also makes resource management more complex. Therefore, how to allocate resources such as bandwidth and power both effectively and efficiently while meeting user demands has become an urgent problem. Although Deep Reinforcement Learning (DRL) has been applied to many aspects of edge-network research, its application to energy-efficient resource allocation remains limited. This paper proposes a DRL-based energy-efficient resource allocation mechanism that aims to allocate resources efficiently while meeting the demands of each mobile user. Analysis of the simulation results shows that the energy efficiency value is obtained once the algorithm converges, verifying the efficiency of the DRL-based mechanism and its effectiveness in meeting user requirements and achieving energy-efficient resource allocation.
AB - Traffic-intensive Multimedia Broadband Services (MBS) lead to explosive mobile traffic growth in 5G networks, and the Mobile Edge Network (MEN) is a potential solution. The mobile edge computing network mainly provides users with ubiquitous computing support to meet the needs of delay-sensitive and computation-intensive services. Although mobile edge networks offer advantages such as low latency, moving storage and computing resources down to the edge also makes resource management more complex. Therefore, how to allocate resources such as bandwidth and power both effectively and efficiently while meeting user demands has become an urgent problem. Although Deep Reinforcement Learning (DRL) has been applied to many aspects of edge-network research, its application to energy-efficient resource allocation remains limited. This paper proposes a DRL-based energy-efficient resource allocation mechanism that aims to allocate resources efficiently while meeting the demands of each mobile user. Analysis of the simulation results shows that the energy efficiency value is obtained once the algorithm converges, verifying the efficiency of the DRL-based mechanism and its effectiveness in meeting user requirements and achieving energy-efficient resource allocation.
KW - Deep Reinforcement Learning
KW - Energy-efficient Resource Allocation
KW - Mobile Edge Network
KW - Multimedia Broadband Services
UR - http://www.scopus.com/inward/record.url?scp=85103472460&partnerID=8YFLogxK
U2 - 10.1109/BMSB49480.2020.9379443
DO - 10.1109/BMSB49480.2020.9379443
M3 - Conference contribution
AN - SCOPUS:85103472460
T3 - IEEE International Symposium on Broadband Multimedia Systems and Broadcasting, BMSB
BT - 15th IEEE International Symposium on Broadband Multimedia Systems and Broadcasting, BMSB 2020
PB - IEEE
T2 - 15th IEEE International Symposium on Broadband Multimedia Systems and Broadcasting, BMSB 2020
Y2 - 27 October 2020 through 29 October 2020
ER -