Incorporating AR into a Multimodal UI for an Artificial Pancreas: The interdisciplinary nature of integrating augmented reality (AR), sound, and touch into a user interface (UI) for diabetes patients with embedded control systems for an artificial pancreas (AP)

Rick Mott
{"title":"Incorporating AR into a Multimodal UI for an Artificial Pancreas: The interdisciplinary nature of integrating augmented reality (AR), sound, and touch into a user interface (UI) for diabetes patients with embedded control systems for an artificial pancreas (AP)","authors":"Rick Mott","doi":"10.1145/3233756.3233932","DOIUrl":null,"url":null,"abstract":"This paper explores the emerging field of human-embedded medical technology and its accompanying need for innovative user experience (UX) design. Specifically, the paper argues that, in the medium term, augmented reality (AR) can effectively serve as one part of a multimodal user interface (UI) for diabetes patients with embedded control systems for an artificial pancreas (AP). The paper explores what types of cross-disciplinary information UX designers will need in order to incorporate AR into an effective multimodal interface for diabetes patients with an AP. Currently, researchers are developing methods to embed continuous glucose monitors (CGM), model predictive control (MPC) systems, and AP systems into diabetes patients to regulate their blood sugar levels [30]. Ideally, an embedded control in a wearable medical device would remove the need for users to interact with the system because the system would self-regulate. However, once embedded, not only will a physician/technician need to initialize and adjust the system, but, ethically and practically, patients will also need access to the system. In the medium-term future (5-10 years), AR-with its emphasis on cross-disciplinary design needs-shows promise for conveying visual information most effectively to the patient. This paper addresses the interdisciplinary nature of conveying user information for embedded medical devices because it demonstrates the need for UX designers to integrate recent advancements in healthcare research, medical technology, cross-disciplinary design theory (e.g. 
human factors and effective use theory), and the rapidly changing nature of human-computer interaction (HCI).","PeriodicalId":153529,"journal":{"name":"Proceedings of the 36th ACM International Conference on the Design of Communication","volume":"115 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-08-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 36th ACM International Conference on the Design of Communication","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3233756.3233932","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

This paper explores the emerging field of human-embedded medical technology and its accompanying need for innovative user experience (UX) design. Specifically, the paper argues that, in the medium term, augmented reality (AR) can effectively serve as one part of a multimodal user interface (UI) for diabetes patients with embedded control systems for an artificial pancreas (AP). The paper explores what types of cross-disciplinary information UX designers will need in order to incorporate AR into an effective multimodal interface for diabetes patients with an AP. Currently, researchers are developing methods to embed continuous glucose monitors (CGM), model predictive control (MPC) systems, and AP systems into diabetes patients to regulate their blood sugar levels [30]. Ideally, an embedded control in a wearable medical device would remove the need for users to interact with the system, because the system would self-regulate. However, once embedded, not only will a physician/technician need to initialize and adjust the system, but, ethically and practically, patients will also need access to the system. In the medium-term future (5-10 years), AR, with its emphasis on cross-disciplinary design needs, shows promise for conveying visual information most effectively to the patient. This paper addresses the interdisciplinary nature of conveying user information for embedded medical devices because it demonstrates the need for UX designers to integrate recent advancements in healthcare research, medical technology, cross-disciplinary design theory (e.g., human factors and effective use theory), and the rapidly changing nature of human-computer interaction (HCI).
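The abstract refers to model predictive control (MPC) as the regulation strategy in an embedded AP system: the controller predicts future glucose readings under candidate insulin doses and picks the dose sequence with the lowest predicted cost. The following is a minimal sketch of that idea, not code from the paper; the linear glucose model, the insulin-sensitivity and drift constants, and the cost weights are all illustrative assumptions, and a real AP controller would use a validated physiological model and clinical safety constraints.

```python
# Minimal sketch of a model-predictive controller (MPC) for glucose
# regulation. All model parameters below are illustrative assumptions,
# not values from any clinical artificial-pancreas system.
from itertools import product

SENSITIVITY = 15.0  # mg/dL drop per unit of insulin (assumed)
DRIFT = 5.0         # mg/dL rise per step from meals/liver (assumed)
TARGET = 110.0      # target blood glucose, mg/dL
HORIZON = 3         # prediction horizon, in control steps
DOSES = [0.0, 1.0, 2.0, 3.0]  # candidate insulin doses per step
LAMBDA = 0.5        # penalty weight on insulin use

def predict(glucose, dose):
    """One-step linear prediction of the next glucose reading."""
    return glucose - SENSITIVITY * dose + DRIFT

def mpc_dose(glucose):
    """Exhaustively evaluate every dose sequence over the horizon and
    return the first dose of the cheapest sequence (receding horizon)."""
    best_cost, best_first = float("inf"), 0.0
    for seq in product(DOSES, repeat=HORIZON):
        g, cost = glucose, 0.0
        for u in seq:
            g = predict(g, u)
            cost += (g - TARGET) ** 2 + LAMBDA * u ** 2
        if cost < best_cost:
            best_cost, best_first = cost, seq[0]
    return best_first

# Simulate a few control steps starting from a high reading.
g = 180.0
for step in range(5):
    u = mpc_dose(g)
    g = predict(g, u)
    print(f"step {step}: dose {u:.1f} U -> glucose {g:.1f} mg/dL")
```

Only the first dose of each optimized sequence is administered before re-planning at the next CGM reading; that receding-horizon loop is what would run continuously in the embedded controller, while the multimodal UI the paper proposes surfaces its state to the patient.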