Evaluating the Usability and Quality of a Clinical Mobile App for Assisting Physicians in Head Computed Tomography Scan Ordering: Mixed Methods Study.

IF 2.6 Q2 HEALTH CARE SCIENCES & SERVICES
JMIR Human Factors Pub Date: 2024-09-09 DOI: 10.2196/55790
Zahra Meidani, Aydine Omidvar, Hossein Akbari, Fatemeh Asghari, Reza Khajouei, Zahra Nazemi, Ehsan Nabovati, Felix Holl
{"title":"Evaluating the Usability and Quality of a Clinical Mobile App for Assisting Physicians in Head Computed Tomography Scan Ordering: Mixed Methods Study.","authors":"Zahra Meidani, Aydine Omidvar, Hossein Akbari, Fatemeh Asghari, Reza Khajouei, Zahra Nazemi, Ehsan Nabovati, Felix Holl","doi":"10.2196/55790","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>Among the numerous factors contributing to health care providers' engagement with mobile apps, including user characteristics (eg, dexterity, anatomy, and attitude) and mobile features (eg, screen and button size), usability and quality of apps have been introduced as the most influential factors.</p><p><strong>Objective: </strong>This study aims to investigate the usability and quality of the Head Computed Tomography Scan Appropriateness Criteria (HAC) mobile app for physicians' computed tomography scan ordering.</p><p><strong>Methods: </strong>Our study design was primarily based on methodological triangulation by using mixed methods research involving quantitative and qualitative think-aloud usability testing, quantitative analysis of the Mobile Apps Rating Scale (MARS) for quality assessment, and debriefing across 3 phases. In total, 16 medical interns participated in quality assessment and testing usability characteristics, including efficiency, effectiveness, learnability, errors, and satisfaction with the HAC app.</p><p><strong>Results: </strong>The efficiency and effectiveness of the HAC app were deemed satisfactory, with ratings of 97.8% and 96.9%, respectively. MARS assessment scale indicated the overall favorable quality score of the HAC app (82 out of 100). Scoring 4 MARS subscales, Information (73.37 out of 100) and Engagement (73.48 out of 100) had the lowest scores, while Aesthetics had the highest score (87.86 out of 100). Analysis of the items in each MARS subscale revealed that in the Engagement subscale, the lowest score of the HAC app was \"customization\" (63.6 out of 100). In the Functionality subscale, the HAC app's lowest value was \"performance\" (67.4 out of 100). Qualitative think-aloud usability testing of the HAC app found notable usability issues grouped into 8 main categories: lack of finger-friendly touch targets, poor search capabilities, input problems, inefficient data presentation and information control, unclear control and confirmation, lack of predictive capabilities, poor assistance and support, and unclear navigation logic.</p><p><strong>Conclusions: </strong>Evaluating the quality and usability of mobile apps using a mixed methods approach provides valuable information about their functionality and disadvantages. 
It is highly recommended to embrace a more holistic and mixed methods strategy when evaluating mobile apps, because results from a single method imperfectly reflect trustworthy and reliable information regarding the usability and quality of apps.</p>","PeriodicalId":36351,"journal":{"name":"JMIR Human Factors","volume":"11 ","pages":"e55790"},"PeriodicalIF":2.6000,"publicationDate":"2024-09-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11420597/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"JMIR Human Factors","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2196/55790","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"HEALTH CARE SCIENCES & SERVICES","Score":null,"Total":0}
Citations: 0

Abstract

Background: Among the numerous factors contributing to health care providers' engagement with mobile apps, including user characteristics (eg, dexterity, anatomy, and attitude) and mobile features (eg, screen and button size), the usability and quality of apps have been identified as the most influential.

Objective: This study aims to investigate the usability and quality of the Head Computed Tomography Scan Appropriateness Criteria (HAC) mobile app for physicians' computed tomography scan ordering.

Methods: Our study design was primarily based on methodological triangulation, using mixed methods research that involved quantitative and qualitative think-aloud usability testing, quantitative analysis with the Mobile Apps Rating Scale (MARS) for quality assessment, and debriefing across 3 phases. In total, 16 medical interns participated in the quality assessment and in testing the usability characteristics of the HAC app, including efficiency, effectiveness, learnability, errors, and satisfaction.
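
The abstract does not spell out how MARS quality scores are aggregated. As a rough illustration only, the sketch below assumes the common convention of averaging item ratings (1 to 5) within each subscale and rescaling to a 0-100 score; the item names and ratings are hypothetical and do not come from the study.

```python
from statistics import mean

# Hypothetical MARS item ratings (1-5) from a single rater, grouped by subscale.
# Item names and values are illustrative; they are not the study's data.
ratings = {
    "Engagement": {"entertainment": 4, "interest": 4, "customization": 3,
                   "interactivity": 4, "target_group": 4},
    "Functionality": {"performance": 3, "ease_of_use": 4, "navigation": 4,
                      "gestural_design": 4},
    "Aesthetics": {"layout": 5, "graphics": 4, "visual_appeal": 4},
    "Information": {"accuracy": 4, "goals": 4, "quality": 3,
                    "quantity": 4, "credibility": 4},
}

def subscale_score(items):
    # Mean item rating (1-5) rescaled to 0-100; the exact conversion used
    # in the study is not stated in the abstract.
    return mean(items.values()) / 5 * 100

subscale_scores = {name: subscale_score(items) for name, items in ratings.items()}
overall = mean(subscale_scores.values())

for name, score in subscale_scores.items():
    print(f"{name}: {score:.1f}/100")
print(f"Overall quality: {overall:.1f}/100")
```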

Results: The efficiency and effectiveness of the HAC app were deemed satisfactory, with ratings of 97.8% and 96.9%, respectively. The MARS assessment indicated an overall favorable quality score for the HAC app (82 out of 100). Of the 4 MARS subscales, Information (73.37 out of 100) and Engagement (73.48 out of 100) had the lowest scores, while Aesthetics had the highest (87.86 out of 100). Item-level analysis showed that within the Engagement subscale the HAC app scored lowest on "customization" (63.6 out of 100), and within the Functionality subscale it scored lowest on "performance" (67.4 out of 100). Qualitative think-aloud usability testing of the HAC app revealed notable usability issues grouped into 8 main categories: lack of finger-friendly touch targets, poor search capabilities, input problems, inefficient data presentation and information control, unclear control and confirmation, lack of predictive capabilities, poor assistance and support, and unclear navigation logic.
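
The abstract does not define how the 97.8% efficiency and 96.9% effectiveness figures were derived. The sketch below illustrates two common usability-testing definitions (task completion rate and time-based efficiency) on made-up task data; it is not a reconstruction of the study's calculation.

```python
# Hypothetical per-task outcomes for one participant: 1 = completed, 0 = failed,
# plus time on task in seconds. Values are illustrative only.
tasks = [
    {"completed": 1, "time_s": 42.0},
    {"completed": 1, "time_s": 55.5},
    {"completed": 0, "time_s": 90.0},
    {"completed": 1, "time_s": 38.2},
]

# Effectiveness: share of tasks completed successfully.
effectiveness = 100 * sum(t["completed"] for t in tasks) / len(tasks)

# Time-based efficiency: mean of (success / time on task), reported per minute.
efficiency_goals_per_min = 60 * sum(t["completed"] / t["time_s"] for t in tasks) / len(tasks)

print(f"Effectiveness: {effectiveness:.1f}%")
print(f"Time-based efficiency: {efficiency_goals_per_min:.2f} goals per minute")
```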

Conclusions: Evaluating the quality and usability of mobile apps with a mixed methods approach provides valuable information about their functionality and shortcomings. A more holistic, mixed methods strategy is strongly recommended when evaluating mobile apps, because results from a single method do not yield trustworthy and reliable information about an app's usability and quality.

Source journal: JMIR Human Factors (Medicine - Health Informatics)
CiteScore: 3.40
Self-citation rate: 3.70%
Publication volume: 123
Review turnaround: 12 weeks