{"title":"利用人工递归神经网络LTSM技术预测2.4-72GHz频段的降雨衰减","authors":"M. Domb, G. Leshem","doi":"10.1109/ICECCE52056.2021.9514095","DOIUrl":null,"url":null,"abstract":"Free-space communication is a leading component in global communications. Its advantages relate to a broader signal spread, no wiring, and ease of engagement. However, satellite communication links suffer from arbitrary weather phenomena such as clouds, rain, snow, fog, and dust. Therefore, satellites commonly use redundant signal strength to ensure constant and continuous signal transmission, resulting in excess energy consumption, challenging the limited power capacity generated by solar energy or the fixed amount of fuel. This research proposes a Machine Learning [ML]-based model that provides a time-dependent prediction of the expected attenuation level due to rain and fog. Based on the predicted attenuation level, we calibrate the communication signal strength to save energy. We used collected data from the Genesis LEO satellite and corresponding simulated data in the range of 2.4GHz to 72GHz. We then executed the ML system, and after several adjustments for the frequencies up to 48GHz, we reached a very narrow gap between the predicted and actual attenuation levels. However, in the 72GHz frequency, we got a partial correlation.","PeriodicalId":302947,"journal":{"name":"2021 International Conference on Electrical, Communication, and Computer Engineering (ICECCE)","volume":"91 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-06-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Rain Attenuation Prediction for 2.4-72GHz using LTSM, an artificial recurrent neural network technology\",\"authors\":\"M. Domb, G. Leshem\",\"doi\":\"10.1109/ICECCE52056.2021.9514095\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Free-space communication is a leading component in global communications. Its advantages relate to a broader signal spread, no wiring, and ease of engagement. However, satellite communication links suffer from arbitrary weather phenomena such as clouds, rain, snow, fog, and dust. Therefore, satellites commonly use redundant signal strength to ensure constant and continuous signal transmission, resulting in excess energy consumption, challenging the limited power capacity generated by solar energy or the fixed amount of fuel. This research proposes a Machine Learning [ML]-based model that provides a time-dependent prediction of the expected attenuation level due to rain and fog. Based on the predicted attenuation level, we calibrate the communication signal strength to save energy. We used collected data from the Genesis LEO satellite and corresponding simulated data in the range of 2.4GHz to 72GHz. We then executed the ML system, and after several adjustments for the frequencies up to 48GHz, we reached a very narrow gap between the predicted and actual attenuation levels. 
However, in the 72GHz frequency, we got a partial correlation.\",\"PeriodicalId\":302947,\"journal\":{\"name\":\"2021 International Conference on Electrical, Communication, and Computer Engineering (ICECCE)\",\"volume\":\"91 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-06-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 International Conference on Electrical, Communication, and Computer Engineering (ICECCE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICECCE52056.2021.9514095\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 International Conference on Electrical, Communication, and Computer Engineering (ICECCE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICECCE52056.2021.9514095","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Rain Attenuation Prediction for 2.4-72GHz using LTSM, an artificial recurrent neural network technology
Free-space communication is a leading component of global communications. Its advantages include broader signal coverage, no wiring, and ease of deployment. However, satellite communication links suffer from unpredictable weather phenomena such as clouds, rain, snow, fog, and dust. Satellites therefore commonly transmit with a redundant signal-strength margin to ensure constant, continuous transmission, which wastes energy and strains the limited power available from solar panels or a fixed fuel supply. This research proposes a machine learning (ML)-based model that provides a time-dependent prediction of the expected attenuation level due to rain and fog. Based on the predicted attenuation level, we calibrate the communication signal strength to save energy. We used data collected from the Genesis LEO satellite together with corresponding simulated data in the range of 2.4 GHz to 72 GHz. We then ran the ML system; after several adjustments, for frequencies up to 48 GHz the gap between the predicted and actual attenuation levels was very narrow, whereas at 72 GHz we obtained only a partial correlation.
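To make the approach concrete, below is a minimal, illustrative sketch of the kind of LSTM (long short-term memory) regressor the abstract describes: predicting the next rain-attenuation value from a short history of attenuation samples. It is not the authors' implementation; the window length, layer size, training settings, and synthetic placeholder data are assumptions made for illustration.

```python
# Illustrative LSTM sketch (assumed architecture, not the paper's model):
# predict the next attenuation value (dB) from the previous WINDOW samples.
import numpy as np
import tensorflow as tf

WINDOW = 32  # number of past samples fed to the model (assumed)

def make_windows(series: np.ndarray, window: int = WINDOW):
    """Slice a 1-D attenuation time series into (history, next-value) pairs."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., np.newaxis], y  # LSTM expects (batch, time, features)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, 1)),
    tf.keras.layers.LSTM(64),   # single LSTM layer; size is an assumption
    tf.keras.layers.Dense(1),   # predicted attenuation in dB
])
model.compile(optimizer="adam", loss="mse")

# Synthetic placeholder series standing in for measured/simulated attenuation data.
rng = np.random.default_rng(0)
attenuation_db = np.cumsum(rng.normal(0.0, 0.1, 5000))
X, y = make_windows(attenuation_db)
model.fit(X, y, epochs=5, batch_size=64, validation_split=0.2, verbose=0)

# Predicted attenuation for the next time step.
next_db = float(model.predict(X[-1:], verbose=0)[0, 0])
```

In a setup like the one described, such a prediction would feed the transmit-power calibration step, replacing a fixed redundant signal-strength margin with one sized to the expected attenuation.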