An artificial intelligence-enabled smartphone app for real-time pressure injury assessment.

Frontiers in Medical Technology · Pub Date: 2022-09-23 · eCollection Date: 2022-01-01 · DOI: 10.3389/fmedt.2022.905074
Chun Hon Lau, Ken Hung-On Yu, Tsz Fung Yip, Luke Yik Fung Luk, Abraham Ka Chung Wai, Tin-Yan Sit, Janet Yuen-Ha Wong, Joshua Wing Kei Ho
{"title":"一款支持人工智能的智能手机应用程序,用于实时压力损伤评估。","authors":"Chun Hon Lau,&nbsp;Ken Hung-On Yu,&nbsp;Tsz Fung Yip,&nbsp;Luke Yik Fung Luk,&nbsp;Abraham Ka Chung Wai,&nbsp;Tin-Yan Sit,&nbsp;Janet Yuen-Ha Wong,&nbsp;Joshua Wing Kei Ho","doi":"10.3389/fmedt.2022.905074","DOIUrl":null,"url":null,"abstract":"<p><p>The management of chronic wounds in the elderly such as pressure injury (also known as bedsore or pressure ulcer) is increasingly important in an ageing population. Accurate classification of the stage of pressure injury is important for wound care planning. Nonetheless, the expertise required for staging is often not available in a residential care home setting. Artificial-intelligence (AI)-based computer vision techniques have opened up opportunities to harness the inbuilt camera in modern smartphones to support pressure injury staging by nursing home carers. In this paper, we summarise the recent development of smartphone or tablet-based applications for wound assessment. Furthermore, we present a new smartphone application (app) to perform real-time detection and staging classification of pressure injury wounds using a deep learning-based object detection system, YOLOv4. Based on our validation set of 144 photos, our app obtained an overall prediction accuracy of 63.2%. The per-class prediction specificity is generally high (85.1%-100%), but have variable sensitivity: 73.3% (stage 1 vs. others), 37% (stage 2 vs. others), 76.7 (stage 3 vs. others), 70% (stage 4 vs. others), and 55.6% (unstageable vs. others). Using another independent test set, 8 out of 10 images were predicted correctly by the YOLOv4 model. When deployed in a real-life setting with two different ambient brightness levels with three different Android phone models, the prediction accuracy of the 10 test images ranges from 80 to 90%, which highlight the importance of evaluation of mobile health (mHealth) application in a simulated real-life setting. This study details the development and evaluation process and demonstrates the feasibility of applying such a real-time staging app in wound care management.</p>","PeriodicalId":12599,"journal":{"name":"Frontiers in Medical Technology","volume":" ","pages":"905074"},"PeriodicalIF":0.0000,"publicationDate":"2022-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9541137/pdf/","citationCount":"4","resultStr":"{\"title\":\"An artificial intelligence-enabled smartphone app for real-time pressure injury assessment.\",\"authors\":\"Chun Hon Lau,&nbsp;Ken Hung-On Yu,&nbsp;Tsz Fung Yip,&nbsp;Luke Yik Fung Luk,&nbsp;Abraham Ka Chung Wai,&nbsp;Tin-Yan Sit,&nbsp;Janet Yuen-Ha Wong,&nbsp;Joshua Wing Kei Ho\",\"doi\":\"10.3389/fmedt.2022.905074\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>The management of chronic wounds in the elderly such as pressure injury (also known as bedsore or pressure ulcer) is increasingly important in an ageing population. Accurate classification of the stage of pressure injury is important for wound care planning. Nonetheless, the expertise required for staging is often not available in a residential care home setting. Artificial-intelligence (AI)-based computer vision techniques have opened up opportunities to harness the inbuilt camera in modern smartphones to support pressure injury staging by nursing home carers. In this paper, we summarise the recent development of smartphone or tablet-based applications for wound assessment. 
Furthermore, we present a new smartphone application (app) to perform real-time detection and staging classification of pressure injury wounds using a deep learning-based object detection system, YOLOv4. Based on our validation set of 144 photos, our app obtained an overall prediction accuracy of 63.2%. The per-class prediction specificity is generally high (85.1%-100%), but have variable sensitivity: 73.3% (stage 1 vs. others), 37% (stage 2 vs. others), 76.7 (stage 3 vs. others), 70% (stage 4 vs. others), and 55.6% (unstageable vs. others). Using another independent test set, 8 out of 10 images were predicted correctly by the YOLOv4 model. When deployed in a real-life setting with two different ambient brightness levels with three different Android phone models, the prediction accuracy of the 10 test images ranges from 80 to 90%, which highlight the importance of evaluation of mobile health (mHealth) application in a simulated real-life setting. This study details the development and evaluation process and demonstrates the feasibility of applying such a real-time staging app in wound care management.</p>\",\"PeriodicalId\":12599,\"journal\":{\"name\":\"Frontiers in Medical Technology\",\"volume\":\" \",\"pages\":\"905074\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-09-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9541137/pdf/\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Frontiers in Medical Technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3389/fmedt.2022.905074\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2022/1/1 0:00:00\",\"PubModel\":\"eCollection\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers in Medical Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3389/fmedt.2022.905074","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2022/1/1 0:00:00","PubModel":"eCollection","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4

Abstract


The management of chronic wounds in the elderly, such as pressure injury (also known as bedsore or pressure ulcer), is increasingly important in an ageing population. Accurate classification of the stage of a pressure injury is important for wound care planning. Nonetheless, the expertise required for staging is often not available in a residential care home setting. Artificial intelligence (AI)-based computer vision techniques have opened up opportunities to harness the built-in camera of modern smartphones to support pressure injury staging by nursing home carers. In this paper, we summarise the recent development of smartphone- or tablet-based applications for wound assessment. Furthermore, we present a new smartphone application (app) that performs real-time detection and staging classification of pressure injury wounds using a deep learning-based object detection system, YOLOv4. On our validation set of 144 photos, the app obtained an overall prediction accuracy of 63.2%. The per-class prediction specificity is generally high (85.1%-100%), but the sensitivity is variable: 73.3% (stage 1 vs. others), 37% (stage 2 vs. others), 76.7% (stage 3 vs. others), 70% (stage 4 vs. others), and 55.6% (unstageable vs. others). On another independent test set, 8 out of 10 images were predicted correctly by the YOLOv4 model. When deployed in a real-life setting with two different ambient brightness levels and three different Android phone models, the prediction accuracy on the 10 test images ranged from 80% to 90%, which highlights the importance of evaluating a mobile health (mHealth) application in a simulated real-life setting. This study details the development and evaluation process and demonstrates the feasibility of applying such a real-time staging app in wound care management.
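
The per-class figures above follow a one-vs-others scheme for each stage. The Python sketch below, which is not the authors' implementation, shows how such one-vs-others sensitivity and specificity (and overall accuracy) could be computed from true and predicted stage labels; the function name, label strings, and example data are hypothetical stand-ins.

# Minimal sketch (not the authors' implementation): one-vs-others sensitivity
# and specificity per pressure-injury stage, plus overall accuracy.
# The label strings and example data below are hypothetical stand-ins.

STAGES = ["stage 1", "stage 2", "stage 3", "stage 4", "unstageable"]

def one_vs_others_metrics(y_true, y_pred, positive):
    """Sensitivity and specificity treating `positive` as the positive class."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    tn = sum(t != positive and p != positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

if __name__ == "__main__":
    # Hypothetical labels standing in for a small validation sample.
    y_true = ["stage 1", "stage 2", "stage 3", "stage 4", "unstageable", "stage 2"]
    y_pred = ["stage 1", "stage 3", "stage 3", "stage 4", "unstageable", "stage 2"]

    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    print(f"overall accuracy: {accuracy:.1%}")
    for stage in STAGES:
        sens, spec = one_vs_others_metrics(y_true, y_pred, stage)
        print(f"{stage}: sensitivity={sens:.1%}, specificity={spec:.1%}")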
