{"title":"使用能量采集传感器控制采样","authors":"A. Seyedi","doi":"10.1109/ITA.2014.6804220","DOIUrl":null,"url":null,"abstract":"The problem of sampling from a remote sensor, powered by energy harvesting, is considered. The problem is formulated as a partially observable Markov decision process (POMDP), since the controller only has partial knowledge of the energy reserve at the sensor. Three policies are proposed and their performances are evaluated and compared to that of a clairvoyant policy.","PeriodicalId":338302,"journal":{"name":"2014 Information Theory and Applications Workshop (ITA)","volume":"1996 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Controlled sampling using an energy harvesting sensor\",\"authors\":\"A. Seyedi\",\"doi\":\"10.1109/ITA.2014.6804220\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The problem of sampling from a remote sensor, powered by energy harvesting, is considered. The problem is formulated as a partially observable Markov decision process (POMDP), since the controller only has partial knowledge of the energy reserve at the sensor. 
Three policies are proposed and their performances are evaluated and compared to that of a clairvoyant policy.\",\"PeriodicalId\":338302,\"journal\":{\"name\":\"2014 Information Theory and Applications Workshop (ITA)\",\"volume\":\"1996 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2014-02-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2014 Information Theory and Applications Workshop (ITA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ITA.2014.6804220\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 Information Theory and Applications Workshop (ITA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ITA.2014.6804220","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Controlled sampling using an energy harvesting sensor
The problem of sampling from a remote sensor powered by energy harvesting is considered. Because the controller has only partial knowledge of the energy reserve at the sensor, the problem is formulated as a partially observable Markov decision process (POMDP). Three policies are proposed, and their performance is evaluated and compared to that of a clairvoyant policy, which has full knowledge of the sensor's energy reserve.
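To illustrate the kind of formulation the abstract describes, the sketch below shows one simple POMDP-style heuristic: a controller that cannot observe the sensor's battery maintains a Bayesian belief over the energy level and requests a sample only when the probability that the sensor can afford it exceeds a threshold. All model details here (battery capacity, Bernoulli harvesting, unit sample cost, the threshold rule itself) are illustrative assumptions, not the paper's actual model or policies.

```python
# Hypothetical sketch: belief tracking for an energy-harvesting sensor whose
# battery level the controller cannot observe. Assumptions (not from the
# paper): battery holds B_MAX units, one unit is harvested per slot with
# probability P_HARVEST, and each sample costs COST units.

B_MAX = 5        # assumed battery capacity, in units of one sample's cost
P_HARVEST = 0.4  # assumed per-slot probability of harvesting one unit
COST = 1         # assumed energy cost of one sample

def harvest_step(belief):
    """Propagate the belief one time slot through the harvesting dynamics."""
    new = [0.0] * (B_MAX + 1)
    for b, p in enumerate(belief):
        new[min(b + 1, B_MAX)] += p * P_HARVEST  # one unit harvested
        new[b] += p * (1 - P_HARVEST)            # nothing harvested
    return new

def request_update(belief, success):
    """Condition the belief on the outcome of a sample request.

    success=True  -> the sensor had >= COST energy and spent COST units.
    success=False -> the sensor had < COST energy; its level is unchanged."""
    new = [0.0] * (B_MAX + 1)
    for b, p in enumerate(belief):
        if success and b >= COST:
            new[b - COST] += p
        elif not success and b < COST:
            new[b] += p
    total = sum(new)
    return [p / total for p in new] if total > 0 else belief

def threshold_policy(belief, tau=0.9):
    """Request a sample when P(battery >= COST) exceeds the threshold tau."""
    return sum(belief[COST:]) > tau
```

A clairvoyant policy, by contrast, would read the battery level directly instead of maintaining a belief, which is why it serves as an upper-bound benchmark for the partially observed policies.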