{"title":"Challenges for Creating Driver Overriding Mechanisms","authors":"Steffen Maurer, E. Rukzio, Rainer Erbach","doi":"10.1145/3131726.3131764","DOIUrl":"https://doi.org/10.1145/3131726.3131764","url":null,"abstract":"This work-in-progress paper describes the challenges and the research needed to create a system that checks the driver's actions for plausibility and, if they fail this check, counteracts or ignores them. After a brief introduction to why such a system might help to avoid accidents, a taxonomy of different (driver) assistive systems is introduced, showing the current lack of such a system in the automotive context. The last part presents the research needed and the research challenges regarding the design of such a system.","PeriodicalId":288342,"journal":{"name":"Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications Adjunct","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125243962","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multimodal Heads Up Displays to Augment Autonomous Vehicle Supervision","authors":"Keenan R. May, Brittany E. Noah, B. Walker","doi":"10.1145/3131726.3131877","DOIUrl":"https://doi.org/10.1145/3131726.3131877","url":null,"abstract":"Drivers using SAE level 2-4 systems are required to supervise the vehicle, and may need to take control when certain conditions arise. While awareness of general automation certainty is crucial, the attention of the supervisory driver could also be directed toward specific areas or objects that the automated system is uncertain about. This video is a mockup of a system that uses a combination of audio and heads-up-display elements to inform the driver of specific areas of uncertainty and allow them to decide whether to take control. This project will continue via participatory design activities followed by simulator research.","PeriodicalId":288342,"journal":{"name":"Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications Adjunct","volume":"170 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122804688","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Force-enabled Touch Input on the Steering Wheel: An Elicitation Study","authors":"Jochen Huber, Mohamed A. Sheik-Nainar, N. Matic","doi":"10.1145/3131726.3131740","DOIUrl":"https://doi.org/10.1145/3131726.3131740","url":null,"abstract":"In this paper, we contribute to the growing effort in the community to standardize the in-car interaction space. We present an interaction language for steering wheel interfaces with force-enabled touch input. Based on an elicitation study, the language maps core force interactions to common in-car commands. The results also shed light on mental models of force touch interaction on the steering wheel and provide guidelines for future interfaces.","PeriodicalId":288342,"journal":{"name":"Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications Adjunct","volume":"190 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114393979","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Control Transferring between Automated and Manual Driving using Shared Control","authors":"Takahiro Saito, T. Wada, Kohei Sonoda","doi":"10.1145/3131726.3131753","DOIUrl":"https://doi.org/10.1145/3131726.3131753","url":null,"abstract":"In this research, we propose a method that connects automated and manual driving via haptic shared control (HSC) to achieve a shared-authority mode, along with an authority transfer method for this mode. Driving simulator experiments showed that smoother steering behaviors were observed with the proposed method, even when rapid steering was required just after the human driver took control.","PeriodicalId":288342,"journal":{"name":"Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications Adjunct","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126381886","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"First Workshop on Trust in the Age of Automated Driving","authors":"Brittany E. Noah, Philipp Wintersberger, Alexander G. Mirnig, Shailie Thakkar, Fei-fei Yan, Thomas M. Gable, J. Kraus, Rod McCall","doi":"10.1145/3131726.3131733","DOIUrl":"https://doi.org/10.1145/3131726.3131733","url":null,"abstract":"This workshop intends to address contemporary issues surrounding trust in technology in the challenging and constantly changing context of automated vehicles. In particular, this workshop focuses on two main aspects: (1) appropriate definitions of trust and associated concepts for the automated driving context, especially regarding trust calibration in individual capabilities versus overall trust; (2) appropriate measures (qualitative and quantitative) to quantify trust in automated vehicles and in-vehicle interfaces. The workshop builds on a keynote and participants' accepted position papers as the basis for focused breakout sessions. The outcome of the workshop will form the basis of a subsequent joint publication by organizers and participants discussing issues (1) and (2).","PeriodicalId":288342,"journal":{"name":"Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications Adjunct","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128093883","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Tutorial: How Does Your HMI Design Affect the Visual Attention of the Driver","authors":"S. Feuerstack, Bertram Wortelen","doi":"10.1145/3131726.3131727","DOIUrl":"https://doi.org/10.1145/3131726.3131727","url":null,"abstract":"Considering the driver's visual attention in Human Machine Interface (HMI) design is critical to ensure fast reaction times in unexpected situations and to promote situation awareness in hand-over situations. The effect of an HMI on the driver's attention distribution can be measured by performing eye-tracking studies in a driving simulator. However, eye-tracking studies require functional HMI prototypes and give no insight into the mechanisms underlying the measured behavior. In this tutorial we introduce a tool-driven, model-based approach to visual attention prediction, which can be applied even to early HMI mockup ideas and with less effort than eye-tracking studies. The tutorial starts with an introduction to the theories of model-based visual attention prediction. Thereafter, participants are invited either to predict the visual attention for their own HMI design ideas or to evaluate an exemplary use case with the software tools, which they can install on their own computers or use in our lab.","PeriodicalId":288342,"journal":{"name":"Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications Adjunct","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132065945","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Examining the Impact of See-Through Cockpits on Driving Performance in a Mixed Reality Prototype","authors":"Patrick Lindemann, G. Rigoll","doi":"10.1145/3131726.3131754","DOIUrl":"https://doi.org/10.1145/3131726.3131754","url":null,"abstract":"We built and evaluated a see-through cockpit prototype for a driving simulation in a mixed reality environment, simulating an HMD-based interface. Advantages of such a system may include better driving performance, collision avoidance, and situation awareness. Early results from driving-line data indicate potential for improving lateral control when driving with transparent cockpits and show no difference between levels of transparency. Based on these results, we extended our prototype and abandoned the head-registered interface in favor of simulating a projection-based system targeting specific car parts. We present the current prototype and discuss how it relates to existing proofs of concept and potential future real-world implementations. We plan to evaluate the latest prototype in a larger-scale study to determine its impact on lane-keeping performance. We also want to account for impairments in the real world and plan to evaluate the system with artificially induced head-tracking errors.","PeriodicalId":288342,"journal":{"name":"Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications Adjunct","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132007288","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Eyes-free In-vehicle Gesture Controls: Auditory-only Displays Reduced Visual Distraction and Workload","authors":"Jason Sterkenburg, S. Landry, M. Jeon","doi":"10.1145/3131726.3131747","DOIUrl":"https://doi.org/10.1145/3131726.3131747","url":null,"abstract":"Visual distraction increases crash risk while driving. Our research focuses on creating and evaluating an air-gesture control system that is less visually demanding than current infotainment systems. We completed a within-subjects experiment with 24 participants, each of whom completed a simulated drive while using six different prototypes in turn. The primary research questions concerned the influence of display combinations (visual, visual/auditory, auditory) and control orientation (vertical vs. horizontal). We recorded lane departures, eye-glance behavior, secondary-task performance, and driver workload. Results demonstrated that all prototypes performed comparably for lane departures, with the auditory-only display showing a strong tendency toward improvement. A deeper look revealed a tradeoff between eyes-on-road time and secondary-task completion time for the auditory-only display -- the safest but slowest among the six prototypes. The auditory-only display also reduced overall workload. Control orientation showed only a small subjective effect in favor of vertical controls.","PeriodicalId":288342,"journal":{"name":"Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications Adjunct","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128584801","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"NERVteh Compact Motion Based Driving Simulator","authors":"Matej Vengust, Bostjan Kaluza, Kristina Stojmenova, J. Sodnik","doi":"10.1145/3131726.3132047","DOIUrl":"https://doi.org/10.1145/3131726.3132047","url":null,"abstract":"NERVteh is a Slovenian high-tech R&D company specialized in vehicle simulation and driver evaluation technologies. Its main product is a compact driving simulator based on a 4DOF motion platform and powerful simulation software. The software is modular and customizable, providing a variety of virtual environments, road and weather conditions, AI-based traffic, and realistic vehicle dynamics. The simulator hardware consists mostly of real car components and offers an almost real-life driving experience. The simulation can run on three large curved screens or on a VR headset. Additionally, it enables real-time communication and synchronization with a set of external sensors for assessing drivers' physical and mental states. This technology is mostly used for driver training and education, biometric evaluation, and profiling. It is also a very efficient tool for developing and testing smart traffic infrastructure, road planning, and driver risk assessment. This demo will give conference participants the opportunity to experience first-hand the latest hardware and software advancements for capturing driver behavior in a simulated driving environment.","PeriodicalId":288342,"journal":{"name":"Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications Adjunct","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129630022","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Haptic In-Vehicle Gesture Controls","authors":"Orestis Georgiou, Valerio Biscione, Adam Harwood, Daniel Griffiths, Marcello Giordano, Benjamin Long, Thomas Carter","doi":"10.1145/3131726.3132045","DOIUrl":"https://doi.org/10.1145/3131726.3132045","url":null,"abstract":"Recent efforts have proposed the use of hand gestures to control in-vehicle infotainment systems (IVISs) in an attempt to reduce visual demand and therefore reduce crash risk. These efforts, however, lack tactile feedback, resulting in a decreased sense of control over the user's intended actions. Here, we describe a demo that uses commercially available novel technologies that mitigate this problem by using focused ultrasound to accurately deliver a radiation force onto the user's operating hand. This new paradigm of mid-air haptic feedback presents new opportunities and challenges for designing effective interaction languages for IVISs. In this paper, we describe our demo prototype and how it addresses some of these challenges.","PeriodicalId":288342,"journal":{"name":"Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications Adjunct","volume":"389 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122297333","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}