Multi-Agent Actor-Critic Method for Joint Duty-Cycle and Transmission Power Control

Sota Sawaguchi, J. Christmann, A. Molnos, C. Bernier, S. Lesecq
{"title":"Multi-Agent Actor-Critic Method for Joint Duty-Cycle and Transmission Power Control","authors":"Sota Sawaguchi, J. Christmann, A. Molnos, C. Bernier, S. Lesecq","doi":"10.23919/DATE48585.2020.9116518","DOIUrl":null,"url":null,"abstract":"In energy-harvesting Internet of Things (EH-IoT) wireless networks, maintaining energy neutral operation (ENO) is crucial for their perpetual operation and maintenance-free property. Guaranteeing this ENO condition and optimal power-performance trade-off under transient harvested energy and wireless channel quality is particularly challenging. This paper proposes a multi-agent actor-critic reinforcement learning for modulating both the transmitter duty-cycle and output power based on the state-of-buffer (SoB) and the state-of-charge (SoC) information as a state. Thanks to these buffers, differently from the state-of-the-art, our solution does not require any model of the wireless transceiver nor any direct measurement of both harvested energy and wireless channel quality for adapting to these uncertainties. Simulation results of a solar powered EH-IoT node using real-life outdoor solar irradiance data show that the proposed method achieves better performance without system failures throughout a year compared to the state-of-the-art that suffers some system downtime. Our approach also predicts almost no system fails during five years of operation.","PeriodicalId":289525,"journal":{"name":"2020 Design, Automation & Test in Europe Conference & Exhibition (DATE)","volume":"54 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 Design, Automation & Test in Europe Conference & Exhibition (DATE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/DATE48585.2020.9116518","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3

Abstract

In energy-harvesting Internet of Things (EH-IoT) wireless networks, maintaining energy-neutral operation (ENO) is crucial for perpetual, maintenance-free operation. Guaranteeing this ENO condition together with an optimal power-performance trade-off under transient harvested energy and varying wireless channel quality is particularly challenging. This paper proposes a multi-agent actor-critic reinforcement learning method that modulates both the transmitter duty-cycle and the output power, using the state-of-buffer (SoB) and state-of-charge (SoC) information as its state. Thanks to these buffers, and unlike the state of the art, our solution requires neither a model of the wireless transceiver nor direct measurements of the harvested energy and the wireless channel quality in order to adapt to these uncertainties. Simulation results for a solar-powered EH-IoT node, driven by real-life outdoor solar irradiance data, show that the proposed method achieves better performance without any system failure throughout a year, whereas the state of the art suffers some system downtime. Our approach also predicts almost no system failures during five years of operation.
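The abstract describes the learning scheme only at a high level. The sketch below illustrates one plausible reading of it: two tabular softmax actors, one selecting the duty cycle and one the transmit power, coupled through a single critic defined over a discretized (SoB, SoC) state. Everything in the sketch, including the bin counts, the action sets, the learning rates, and the toy reward, is an illustrative assumption; it is not the authors' implementation.

```python
# Minimal multi-agent actor-critic sketch (not the paper's code): two actors
# share one tabular critic over a discretized (SoB, SoC) state.
import numpy as np

N_SOB_BINS, N_SOC_BINS = 8, 8              # assumed discretization of (SoB, SoC)
DUTY_CYCLES = [0.01, 0.05, 0.10, 0.20]     # assumed duty-cycle action set
TX_POWERS_DBM = [-10, 0, 5, 10]            # assumed output-power action set (dBm)
ALPHA_ACTOR, ALPHA_CRITIC, GAMMA = 0.05, 0.10, 0.95

class SoftmaxActor:
    """One agent: tabular softmax policy over its own discrete action set."""
    def __init__(self, n_actions):
        self.theta = np.zeros((N_SOB_BINS, N_SOC_BINS, n_actions))

    def act(self, s):
        logits = self.theta[s]
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        action = np.random.choice(len(probs), p=probs)
        return action, probs

    def update(self, s, action, probs, td_error):
        grad = -probs                       # d log pi(a|s) / d theta = 1{a} - pi
        grad[action] += 1.0
        self.theta[s] += ALPHA_ACTOR * td_error * grad

# Shared critic: tabular state value V(SoB, SoC).
V = np.zeros((N_SOB_BINS, N_SOC_BINS))
duty_agent = SoftmaxActor(len(DUTY_CYCLES))
power_agent = SoftmaxActor(len(TX_POWERS_DBM))

def step_update(s, s_next, reward, a_dc, p_dc, a_tx, p_tx):
    """One actor-critic step: the shared critic's TD error drives both actors."""
    td_error = reward + GAMMA * V[s_next] - V[s]
    V[s] += ALPHA_CRITIC * td_error
    duty_agent.update(s, a_dc, p_dc, td_error)
    power_agent.update(s, a_tx, p_tx, td_error)

# Toy usage with random state transitions and a placeholder reward that favours
# higher duty cycles but penalises an empty data buffer or an empty battery --
# purely illustrative, not the paper's reward or node model.
rng = np.random.default_rng(0)
s = (4, 4)
for _ in range(1000):
    a_dc, p_dc = duty_agent.act(s)
    a_tx, p_tx = power_agent.act(s)
    s_next = (int(rng.integers(N_SOB_BINS)), int(rng.integers(N_SOC_BINS)))
    reward = -1.0 if 0 in s_next else DUTY_CYCLES[a_dc]
    step_update(s, s_next, reward, a_dc, p_dc, a_tx, p_tx)
    s = s_next
```

Sharing one critic keeps the two actors coordinated through a common TD error, which is a common way for a multi-agent actor-critic to couple agents that observe the same state; whether the paper uses a shared or separate critics is not stated in the abstract.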