Self-learning adaptive power management scheme for energy-efficient IoT-MEC systems using soft actor-critic algorithm

IF 6.0 · CAS Tier 3 (Computer Science) · Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS
Amir Masoud Rahmani, Amir Haider, Komeil Moghaddasi, Farhad Soleimanian Gharehchopogh, Khursheed Aurangzeb, Zhe Liu, Mehdi Hosseinzadeh
DOI: 10.1016/j.iot.2025.101587
Journal: Internet of Things, Volume 31, Article 101587
Published: 2025-03-21
Citations: 0

Abstract

The rapid growth in the number of Internet of Things (IoT) devices in Mobile Edge Computing (MEC) environments requires effective energy management to ensure device operation and enhance network efficiency. IoT-MEC systems face challenges such as varying task loads, dynamic environmental conditions, and limited energy resources. These factors make it difficult to design adaptive and efficient energy strategies. Traditional methods, such as static scheduling and centralized control strategies, struggle to adapt to real-time fluctuations in task loads and network conditions, resulting in inefficient energy use, higher latency, and a lack of flexibility to respond to these demands in real time. This paper proposes a self-learning power management model using the Soft Actor-Critic (SAC) algorithm. It is deployed on IoT devices to enable localized and context-aware power management. Our model includes modules for energy monitoring, adaptive task prioritization, and a self-adjusting reinforcement-learning mechanism, which dynamically fine-tunes energy policies based on real-time device conditions, allowing each device to optimize power use independently without heavy dependence on centralized control. MEC nodes gather data on battery health, load, and network conditions to support decentralized policy adjustments. Connected devices in simulated smart homes served as the primary context for evaluation. Experimental results show that our model achieves a 45% reduction in energy consumption in smart home environments, a 49% improvement in battery life (compared with the Local Computing baseline), and high adaptability across diverse scenarios compared with other methods.
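The paper's specific model and hyperparameters are not given in the abstract, but the core mechanism SAC builds on can be illustrated in miniature. The sketch below is a tabular, entropy-regularized ("soft") value iteration over hypothetical battery states and power modes; the state space, rewards, and transitions are invented for illustration, and a real SAC deployment would use neural actor and critic networks over continuous observations instead.

```python
import numpy as np

# Hypothetical toy setup: 3 battery states (low/medium/high) x 2 power modes.
# Demonstrates the entropy-regularized soft value update at the heart of SAC,
# in tabular form rather than with neural function approximators.
n_states, n_actions = 3, 2
alpha = 0.5   # entropy temperature (higher -> more exploratory policy)
gamma = 0.95  # discount factor

# Hypothetical rewards: the high-power mode (action 1) is penalized when
# the battery is low, and rewarded when the battery is high.
R = np.array([[1.0, -2.0],   # battery low
              [1.0,  0.5],   # battery medium
              [1.0,  1.5]])  # battery high
# Deterministic toy transitions: action 0 keeps the state,
# action 1 drains the battery one level (floored at "low").
P = np.array([[0, 0],
              [1, 0],
              [2, 1]])

Q = np.zeros((n_states, n_actions))
for _ in range(200):
    # Soft state value: V(s) = alpha * log sum_a exp(Q(s, a) / alpha)
    V = alpha * np.log(np.sum(np.exp(Q / alpha), axis=1))
    # Soft Bellman backup
    Q = R + gamma * V[P]

# Soft (Boltzmann) policy: pi(a|s) proportional to exp(Q(s, a) / alpha)
pi = np.exp(Q / alpha)
pi /= pi.sum(axis=1, keepdims=True)
print(pi.round(3))
```

With these toy numbers, the resulting policy strongly prefers the low-power mode in the low-battery state while keeping nonzero probability on every action, which is the entropy bonus at work: SAC trades a little reward for policies that stay stochastic and adaptive.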
Source journal: Internet of Things
CiteScore: 3.60
Self-citation rate: 5.10%
Articles per year: 115
Review time: 37 days
Journal description: Internet of Things; Engineering Cyber Physical Human Systems is a comprehensive journal encouraging cross-collaboration between researchers, engineers, and practitioners in the field of IoT & Cyber Physical Human Systems. The journal offers a unique platform to exchange scientific information on the entire breadth of technology, science, and societal applications of the IoT. The journal places a high priority on timely publication and provides a home for high-quality work. Furthermore, IoT is interested in publishing topical Special Issues on any aspect of IoT.