Xiaotian Zhang, Weiping He, Yunfei Qin, Mark Billinghurst, Jiepeng Dong, Daisong Liu, Jilong Bai, Zenglei Wang

Displays, Volume 84, Article 102768 (published 2024-06-06). DOI: 10.1016/j.displa.2024.102768. JCR Q1 (Computer Science, Hardware & Architecture), Impact Factor 3.7.
Comparison of visual and multisensory augmented reality for precise manual manipulation tasks
Precise manual manipulation is an important skill in daily life, and Augmented Reality (AR) is increasingly being used to support such operations. This article reports on a study investigating the usability of visual and multisensory AR for precise manual manipulation tasks, in particular the representation of detailed deviations from the target pose. Two AR instruction interfaces were developed: the visual deviation instruction and the multisensory deviation instruction. Both interfaces used visual cues to indicate the required directions for manipulation. The difference was that the visual deviation instruction used text and color mapping to represent deviations, whereas the multisensory deviation instruction used sonification and vibration to represent deviations. A user study was conducted with 16 participants to compare the two interfaces. The results found a significant difference only in speed, without significant differences in accuracy, perceived ease-of-use, workload, or custom user experience elements. Multisensory deviation cues can speed up precise manual manipulation compared to visual deviation cues, but inappropriate sonification and vibration strategies can negatively affect users’ subjective experience, offsetting the benefits of multisensory AR. Based on the results, several recommendations were provided for designing AR instruction interfaces to support precise manual manipulation.
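The two deviation encodings described above can be sketched in code. This is an illustrative example only, not the paper's actual implementation: the maximum-deviation range, colour ramp, and tone/vibration scales below are assumptions chosen for demonstration.

```python
# Illustrative sketch (assumed parameters, not the study's actual mappings):
# encoding a pose deviation magnitude as a colour (visual deviation cue)
# versus a tone pitch and vibration strength (multisensory deviation cue).

def color_for_deviation(dev_mm: float, max_dev_mm: float = 10.0) -> tuple:
    """Map deviation magnitude to an RGB colour from green (on target) to red."""
    t = min(abs(dev_mm) / max_dev_mm, 1.0)  # normalise to [0, 1]
    return (int(255 * t), int(255 * (1 - t)), 0)  # (R, G, B)

def sonification_for_deviation(dev_mm: float, max_dev_mm: float = 10.0) -> dict:
    """Map deviation magnitude to a tone pitch and a vibration amplitude."""
    t = min(abs(dev_mm) / max_dev_mm, 1.0)
    return {
        "tone_hz": 220.0 + 660.0 * t,  # higher pitch signals a larger deviation
        "vibration": t,                # normalised haptic motor amplitude
    }
```

Both functions encode the same scalar deviation; the study's finding suggests that how this redundant channel is delivered (and whether the sonification/vibration strategy is appropriate) matters as much as the encoding itself.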
Journal introduction:
Displays is the international journal covering the research and development of display technology, the effective presentation and perception of information, and applications and systems including the display-human interface.
Technical papers on practical developments in display technology provide an effective channel to promote greater understanding and cross-fertilization across the diverse disciplines of the Displays community. Original research papers solving ergonomics issues at the display-human interface advance the effective presentation of information. Tutorial papers covering fundamentals, intended for display technologists and human-factors engineers new to the field, will also occasionally be featured.