{"title":"基于机器学习和LoRa的医疗保健模型的性能分析和比较。","authors":"Navneet Verma, Sukhdip Singh, Devendra Prasad","doi":"10.1007/s00521-023-08411-5","DOIUrl":null,"url":null,"abstract":"<p><p>Diabetes Mellitus (DM) is a widespread condition that is one of the main causes of health disasters around the world, and health monitoring is one of the sustainable development topics. Currently, the Internet of Things (IoT) and Machine Learning (ML) technologies work together to provide a reliable method of monitoring and predicting Diabetes Mellitus. In this paper, we present the performance of a model for patient real-time data collection that employs the Hybrid Enhanced Adaptive Data Rate (HEADR) algorithm for the Long-Range (LoRa) protocol of the IoT. On the Contiki Cooja simulator, the LoRa protocol's performance is measured in terms of high dissemination and dynamic data transmission range allocation. Furthermore, by employing classification methods for the detection of diabetes severity levels on acquired data via the LoRa (HEADR) protocol, Machine Learning prediction takes place. For prediction, a variety of Machine Learning classifiers are employed, and the final results are compared with the already existing models where the Random Forest and Decision Tree classifiers outperform the others in terms of precision, recall, <i>F</i>-measure, and receiver operating curve (ROC) in the Python programming language. We also discovered that using <i>k</i>-fold cross-validation on <i>k</i>-neighbors, Logistic regression (LR), and Gaussian Nave Bayes (GNB) classifiers boosted the accuracy.</p>","PeriodicalId":49766,"journal":{"name":"Neural Computing & Applications","volume":"35 17","pages":"12751-12761"},"PeriodicalIF":4.5000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9989556/pdf/","citationCount":"0","resultStr":"{\"title\":\"Performance analysis and comparison of Machine Learning and LoRa-based Healthcare model.\",\"authors\":\"Navneet Verma, Sukhdip Singh, Devendra Prasad\",\"doi\":\"10.1007/s00521-023-08411-5\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Diabetes Mellitus (DM) is a widespread condition that is one of the main causes of health disasters around the world, and health monitoring is one of the sustainable development topics. Currently, the Internet of Things (IoT) and Machine Learning (ML) technologies work together to provide a reliable method of monitoring and predicting Diabetes Mellitus. In this paper, we present the performance of a model for patient real-time data collection that employs the Hybrid Enhanced Adaptive Data Rate (HEADR) algorithm for the Long-Range (LoRa) protocol of the IoT. On the Contiki Cooja simulator, the LoRa protocol's performance is measured in terms of high dissemination and dynamic data transmission range allocation. Furthermore, by employing classification methods for the detection of diabetes severity levels on acquired data via the LoRa (HEADR) protocol, Machine Learning prediction takes place. For prediction, a variety of Machine Learning classifiers are employed, and the final results are compared with the already existing models where the Random Forest and Decision Tree classifiers outperform the others in terms of precision, recall, <i>F</i>-measure, and receiver operating curve (ROC) in the Python programming language. 
We also discovered that using <i>k</i>-fold cross-validation on <i>k</i>-neighbors, Logistic regression (LR), and Gaussian Nave Bayes (GNB) classifiers boosted the accuracy.</p>\",\"PeriodicalId\":49766,\"journal\":{\"name\":\"Neural Computing & Applications\",\"volume\":\"35 17\",\"pages\":\"12751-12761\"},\"PeriodicalIF\":4.5000,\"publicationDate\":\"2023-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9989556/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Computing & Applications\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1007/s00521-023-08411-5\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2023/3/7 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Computing & Applications","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s00521-023-08411-5","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2023/3/7 0:00:00","PubModel":"Epub","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Performance analysis and comparison of Machine Learning and LoRa-based Healthcare model.
Diabetes Mellitus (DM) is a widespread condition and one of the leading causes of health crises worldwide, and health monitoring is a key topic in sustainable development. Currently, Internet of Things (IoT) and Machine Learning (ML) technologies work together to provide reliable methods for monitoring and predicting Diabetes Mellitus. In this paper, we present the performance of a model for real-time patient data collection that employs the Hybrid Enhanced Adaptive Data Rate (HEADR) algorithm for the Long-Range (LoRa) IoT protocol. The LoRa protocol's performance is measured on the Contiki Cooja simulator in terms of high dissemination and dynamic allocation of the data transmission range. Machine Learning prediction is then performed by applying classification methods to detect diabetes severity levels in the data acquired via the LoRa (HEADR) protocol. A variety of Machine Learning classifiers, implemented in Python, are employed for prediction, and the results are compared with existing models; the Random Forest and Decision Tree classifiers outperform the others in terms of precision, recall, F-measure, and the receiver operating characteristic (ROC) curve. We also found that applying k-fold cross-validation to the k-nearest neighbors (KNN), Logistic Regression (LR), and Gaussian Naïve Bayes (GNB) classifiers boosted their accuracy.
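The paper's full pipeline is not reproduced here; as a rough, minimal sketch of the evaluation step the abstract describes, the following Python code (assuming scikit-learn is available) compares the named classifiers under stratified 10-fold cross-validation on synthetic stand-in data. The dataset, hyperparameters, and fold count are illustrative assumptions, not the authors' settings or their LoRa-collected patient records.

```python
"""Minimal sketch: compare the classifiers named in the abstract with
stratified k-fold cross-validation, reporting precision, recall,
F-measure, and ROC AUC. Synthetic data stands in for the paper's
diabetes-severity dataset."""
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_validate
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary-classification data as a stand-in for patient records.
X, y = make_classification(n_samples=768, n_features=8, n_informative=5,
                           random_state=42)

classifiers = {
    "Random Forest": RandomForestClassifier(random_state=42),
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Gaussian Naive Bayes": GaussianNB(),
}

# Stratified folds keep the class balance in every split, which matters
# for the imbalanced labels typical of medical data.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)
scoring = ["precision", "recall", "f1", "roc_auc"]

for name, clf in classifiers.items():
    scores = cross_validate(clf, X, y, cv=cv, scoring=scoring)
    summary = ", ".join(f"{m}={scores['test_' + m].mean():.3f}"
                        for m in scoring)
    print(f"{name}: {summary}")
```

Averaging each metric across folds, as above, is one common way to realize the abstract's observation that k-fold cross-validation stabilizes and boosts the reported accuracy of KNN, LR, and GNB.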
Journal Introduction:
Neural Computing & Applications is an international journal which publishes original research and other information in the field of practical applications of neural computing and related techniques such as genetic algorithms, fuzzy logic and neuro-fuzzy systems.
All items relevant to building practical systems are within its scope, including but not limited to:
- adaptive computing
- algorithms
- applicable neural networks theory
- applied statistics
- architectures
- artificial intelligence
- benchmarks
- case histories of innovative applications
- fuzzy logic
- genetic algorithms
- hardware implementations
- hybrid intelligent systems
- intelligent agents
- intelligent control systems
- intelligent diagnostics
- intelligent forecasting
- machine learning
- neural networks
- neuro-fuzzy systems
- pattern recognition
- performance measures
- self-learning systems
- software simulations
- supervised and unsupervised learning methods
- system engineering and integration.
Featured contributions fall into several categories: Original Articles, Review Articles, Book Reviews and Announcements.