Don't you see them?: towards gaze-based interaction adaptation for driver-vehicle cooperation

Marcel Walch, David Lehr, Mark Colley, M. Weber

Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications: Adjunct Proceedings
DOI: 10.1145/3349263.3351338
Published: 2019-09-21
Citations: 10

Abstract

Highly automated driving evolves steadily and is even gradually entering public roads. Nevertheless, some driving-related tasks can still be handled more efficiently by humans. Cooperation with the human user at a higher abstraction level of the dynamic driving task has been suggested to overcome such operational boundaries. This cooperation includes, for example, deciding whether pedestrians intend to cross the road ahead. We suggest that systems should monitor their users when they have to make such decisions. Moreover, these systems can adapt the interaction to support their users. In particular, they can match gaze direction against objects in their environmental model, such as vulnerable road users, to guide users' attention towards overlooked objects. We conducted a pilot study to investigate the need for and feasibility of this concept. Our preliminary analysis showed that some participants overlooked pedestrians who intended to cross the road, which such systems could prevent.
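The gaze-object matching the abstract describes can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: all names (`TrackedObject`, `update_gaze_matches`, `overlooked`), the angular-tolerance cone, and the dwell-time threshold are assumptions introduced here to make the idea concrete.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """An object from the vehicle's environmental model (e.g. a pedestrian)."""
    name: str
    bearing_deg: float      # horizontal angle of the object from the driver's viewpoint
    dwell_s: float = 0.0    # accumulated time the driver's gaze covered this object

def update_gaze_matches(objects, gaze_deg, dt_s, fov_deg=10.0):
    """Accumulate gaze dwell time on every object whose bearing falls
    within a tolerance cone around the current gaze direction.
    (fov_deg is an assumed tolerance, not a value from the paper.)"""
    for obj in objects:
        # smallest angular difference, handling wrap-around at +/-180 degrees
        diff = abs((obj.bearing_deg - gaze_deg + 180.0) % 360.0 - 180.0)
        if diff <= fov_deg / 2.0:
            obj.dwell_s += dt_s

def overlooked(objects, min_dwell_s=0.3):
    """Objects the driver has not fixated long enough are candidates
    for an adapted interaction, e.g. highlighting them in the HMI."""
    return [o for o in objects if o.dwell_s < min_dwell_s]
```

For example, if the driver's gaze sits at 3° for half a second while a pedestrian is at 5° and a cyclist at -60°, only the cyclist would be flagged as overlooked and could be highlighted to the driver.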