Incorporating AR into a Multimodal UI for an Artificial Pancreas: The interdisciplinary nature of integrating augmented reality (AR), sound, and touch into a user interface (UI) for diabetes patients with embedded control systems for an artificial pancreas (AP)
{"title":"Incorporating AR into a Multimodal UI for an Artificial Pancreas: The interdisciplinary nature of integrating augmented reality (AR), sound, and touch into a user interface (UI) for diabetes patients with embedded control systems for an artificial pancreas (AP)","authors":"Rick Mott","doi":"10.1145/3233756.3233932","DOIUrl":null,"url":null,"abstract":"This paper explores the emerging field of human-embedded medical technology and its accompanying need for innovative user experience (UX) design. Specifically, the paper argues that, in the medium term, augmented reality (AR) can effectively serve as one part of a multimodal user interface (UI) for diabetes patients with embedded control systems for an artificial pancreas (AP). The paper explores what types of cross-disciplinary information UX designers will need in order to incorporate AR into an effective multimodal interface for diabetes patients with an AP. Currently, researchers are developing methods to embed continuous glucose monitors (CGM), model predictive control (MPC) systems, and AP systems into diabetes patients to regulate their blood sugar levels [30]. Ideally, an embedded control in a wearable medical device would remove the need for users to interact with the system because the system would self-regulate. However, once embedded, not only will a physician/technician need to initialize and adjust the system, but, ethically and practically, patients will also need access to the system. In the medium-term future (5-10 years), AR-with its emphasis on cross-disciplinary design needs-shows promise for conveying visual information most effectively to the patient. This paper addresses the interdisciplinary nature of conveying user information for embedded medical devices because it demonstrates the need for UX designers to integrate recent advancements in healthcare research, medical technology, cross-disciplinary design theory (e.g. human factors and effective use theory), and the rapidly changing nature of human-computer interaction (HCI).","PeriodicalId":153529,"journal":{"name":"Proceedings of the 36th ACM International Conference on the Design of Communication","volume":"115 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-08-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 36th ACM International Conference on the Design of Communication","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3233756.3233932","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
This paper explores the emerging field of human-embedded medical technology and its accompanying need for innovative user experience (UX) design. Specifically, the paper argues that, in the medium term, augmented reality (AR) can effectively serve as one part of a multimodal user interface (UI) for diabetes patients with embedded control systems for an artificial pancreas (AP). The paper explores what types of cross-disciplinary information UX designers will need in order to incorporate AR into an effective multimodal interface for diabetes patients with an AP. Currently, researchers are developing methods to embed continuous glucose monitors (CGM), model predictive control (MPC) systems, and AP systems in diabetes patients to regulate their blood sugar levels [30]. Ideally, an embedded control system in a wearable medical device would remove the need for users to interact with the system, because the system would self-regulate. In practice, however, a physician or technician will need to initialize and adjust the system once it is embedded, and, both ethically and practically, patients will also need access to it. In the medium-term future (5-10 years), AR, with its emphasis on cross-disciplinary design needs, shows promise for conveying visual information to the patient most effectively. The paper thus addresses the interdisciplinary nature of conveying user information for embedded medical devices, demonstrating the need for UX designers to integrate recent advancements in healthcare research, medical technology, cross-disciplinary design theory (e.g., human factors and effective use theory), and the rapidly changing nature of human-computer interaction (HCI).
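To make the closed-loop idea behind a CGM-driven AP concrete, the sketch below shows a toy receding-horizon control step in the spirit of MPC: at each cycle, a CGM reading is fed into a simple glucose model, candidate insulin rates are scored over a short prediction horizon, and the best-scoring rate is applied. This is a minimal illustration under stated assumptions, not the paper's system or any clinical algorithm; every constant, parameter, and function name (`predict_glucose`, `mpc_step`, the `drift` and `sensitivity` terms, the candidate rates) is a made-up placeholder, and real MPC solves a constrained optimization rather than a grid search.

```python
# Toy closed-loop "artificial pancreas" control step, for illustration only.
# The linear glucose model and all constants are hypothetical assumptions.

TARGET_MGDL = 110.0          # desired blood glucose (mg/dL)
HORIZON = 6                  # prediction steps (e.g., 5-minute CGM intervals)
INSULIN_RATES = [0.0, 0.5, 1.0, 1.5, 2.0]  # candidate basal rates (U/h)

def predict_glucose(g0, insulin_rate, steps, drift=1.0, sensitivity=4.0):
    """Predict a glucose trajectory under a constant insulin rate.

    `drift` stands in for carbohydrate absorption pushing glucose up each
    step; `sensitivity` stands in for the glucose drop per unit of insulin.
    Both are invented constants, not physiological values.
    """
    g = g0
    trajectory = []
    for _ in range(steps):
        g = g + drift - sensitivity * insulin_rate
        trajectory.append(g)
    return trajectory

def mpc_step(cgm_reading):
    """Pick the candidate rate whose predicted trajectory best tracks target.

    Real MPC solves a constrained optimization each cycle; this grid search
    over a few rates is a stand-in to show the receding-horizon idea.
    """
    def cost(rate):
        traj = predict_glucose(cgm_reading, rate, HORIZON)
        # Penalize hypoglycemia (below ~70 mg/dL) far more heavily than highs.
        return sum((g - TARGET_MGDL) ** 2 + (1000.0 if g < 70 else 0.0)
                   for g in traj)
    return min(INSULIN_RATES, key=cost)

if __name__ == "__main__":
    # Simulate a few control cycles starting from an elevated CGM reading.
    glucose = 180.0
    for step in range(5):
        rate = mpc_step(glucose)
        glucose = predict_glucose(glucose, rate, 1)[0]
        print(f"step {step}: rate={rate:.1f} U/h, glucose={glucose:.1f} mg/dL")
```

The sketch also hints at why the paper's UI argument matters: even a self-regulating loop like this one produces state (readings, predicted trajectories, chosen rates) that a patient or clinician may need to inspect or override, which is the information an AR-based multimodal interface would surface.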