{"title":"Tactical Decisions for Lane Changes or Lane Following? Development of a Study Design for Automated Driving","authors":"Johannes Ossig, Stephanie Cramer","doi":"10.1145/3409251.3411714","DOIUrl":"https://doi.org/10.1145/3409251.3411714","url":null,"abstract":"Overtaking slower vehicles on a highway usually involves lane changes. This paper examines a large number of non-automated as well as automated lane changes on the basis of two datasets. The focus is on the relationship between the velocity of the preceding vehicle being overtaken and the target velocity of the vehicle involved in changing lane and overtaking. Based on this, a study design is developed and should enable human-centered investigation of the preferred points in time for automated lane changes. In order to identify further characteristics of an automated journey that can influence the preferred lane change behavior, expert interviews were conducted, to be taken into consideration in the study design. According to this, non-driving related tasks play an essential role in the proposed driving study.","PeriodicalId":373501,"journal":{"name":"12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications","volume":"92 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124202298","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Designing the Interaction of Highly Automated Vehicles with Cyclists in Urban Longitudinal Traffic: Relevant Use Cases and Methodical Considerations","authors":"Nicole Fritz, F. Kobiela, D. Manstetten, A. Korthauer, K. Bengler","doi":"10.1145/3409251.3411710","DOIUrl":"https://doi.org/10.1145/3409251.3411710","url":null,"abstract":"In future urban traffic, highly automated vehicles (HAVs) will have to successfully interact with vulnerable road users, such as pedestrians and cyclists. While the interaction of HAVs with crossing pedestrians is already well studied, HAV interaction concepts for the encounters with cyclists are yet to be explored. We present a project that focuses on the user-centered design of HAV driving maneuvers for interactions with cyclists travelling upfront and in the same direction in urban longitudinal traffic. This work introduces the use cases and the methodical approach to explore current cyclist-vehicle interactions in a real life setting. With this approach, we aim to derive implications for the design of future HAV interaction behavior.","PeriodicalId":373501,"journal":{"name":"12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications","volume":"121 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124935451","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"VR-PAVIB: The Virtual Reality Pedestrian-Autonomous Vehicle Interaction Benchmark","authors":"Ana Fiona Dalipi, Dongfang Liu, Xiaolei Guo, Victor Y. Chen, Christos Mousas","doi":"10.1145/3409251.3411718","DOIUrl":"https://doi.org/10.1145/3409251.3411718","url":null,"abstract":"Autonomous vehicles (AVs) are an emerging theme for future transportation. However, research on pedestrian-AV interaction, which promotes pedestrian safety during autonomous driving, is not a well-explored domain. One challenge preventing the development of pedestrian-AV interaction research is that there is no publicly available and standardized benchmark to allow researchers to investigate how different interfaces could help pedestrians communicate with AVs. To resolve this challenge, we introduce the Virtual Reality Pedestrian-Autonomous Vehicle Interaction Benchmark (VR-PAVIB). VR-PAVIB is a standardized platform that can be used to reproduce interaction scenarios and compare results. Our benchmark provides state-of-the-art functionalities that can easily be implemented in any interaction scenario authored by a user. The VR-PAVIB can easily be used in a controlled lab space using low-cost virtual reality equipment. We have released our project code and include the automotive user interface community to extend VR-PAVIB.","PeriodicalId":373501,"journal":{"name":"12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126750968","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"”What is it?” How to Collect Urgent Utterances using a Gamification Approach","authors":"Jakob Landesberger, U. Ehrlich, W. Minker","doi":"10.1145/3409251.3411713","DOIUrl":"https://doi.org/10.1145/3409251.3411713","url":null,"abstract":"Many modern cars today have voice assistants. The problem is that current in car speech interfaces are mostly designed for certain commands in a very restricted form. In the future, these interfaces will have to deal with more complex user input like several intentions in one utterance or quick urgent insertions. Especially in rapidly changing situations like during a highly automated journey, it becomes relevant to detect urgent utterances and react accordingly. Collecting data by reproducing the conditions for the interaction in a real vehicle can by very difficult. Therefore we propose, to abstract the problem and use a gamification approach. We successfully simulated urgent situations and collected spoken utterances with the game “What is it?”.","PeriodicalId":373501,"journal":{"name":"12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131984353","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Addressing Rogue Vehicles by Integrating Computer Vision, Activity Monitoring, and Contextual Information","authors":"B. Abegaz, Eric Chan-Tin, Neil Klingensmith, G. Thiruvathukal","doi":"10.1145/3409251.3411724","DOIUrl":"https://doi.org/10.1145/3409251.3411724","url":null,"abstract":"In this paper, we address the detection of rogue autonomous vehicles using an integrated approach involving computer vision, activity monitoring and contextual information. The proposed approach can be used to detect rogue autonomous vehicles using sensors installed on observer vehicles that are used to monitor and identify the behavior of other autonomous vehicles operating on the road. The safe braking distance and the safe following time are computed to identify if an autonomous vehicle is behaving properly. Our preliminary results show that there is a wide variation in both the safe following time and the safe braking distance recorded using three autonomous vehicles in a test-bed. These initial results show significant progress for the future efforts to coordinate the operation of autonomous, semi-autonomous and non-autonomous vehicles.","PeriodicalId":373501,"journal":{"name":"12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications","volume":"111 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131093800","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"“Help, Accident Ahead!”: Using Mixed Reality Environments in Automated Vehicles to Support Occupants After Passive Accident Experiences","authors":"Henrik Detjen, Stefan Geisler, Stefan Schneegass","doi":"10.1145/3409251.3411723","DOIUrl":"https://doi.org/10.1145/3409251.3411723","url":null,"abstract":"Currently, car assistant systems mainly try to prevent accidents. Increasing built-in car technology also extends the potential applications in vehicles. Future cars might have virtual windshields that augment the traffic or individual virtual assistants interacting with the user. In this paper, we explore the potential of an assistant system that helps the car’s occupants to calm down and reduce stress when they experience an accident in front of them. We present requirements from a discussion (N = 11) and derive a system design from them. Further, we test the system design in a video-based simulator study (N = 43). Our results indicate that an accident support system increases perceived control and trust and helps to calm down the user.","PeriodicalId":373501,"journal":{"name":"12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114015682","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Saluton! How do you evaluate usability? – Virtual Workshop on Usability Assessments of Automated Driving Systems","authors":"Deike Albers, Niklas Grabbe, Dominik Janetzko, K. Bengler","doi":"10.1145/3409251.3411737","DOIUrl":"https://doi.org/10.1145/3409251.3411737","url":null,"abstract":"The usability of human-machine-interfaces (HMIs) for automated driving systems (ADS) gains importance with the imminent introduction of SAE L3 automated vehicles [15]. Assuming global proliferation of automated vehicles, a common understanding of usability for ADS HMIs and its application in research and industry is indispensable. In reference to ISO 9241-11 [8], this virtual workshop aims to identify potential differences in the understanding and the resulting assessment of usability. The international audience of the Automotive-UI poses an ideal setting for this purpose by bringing together academics and practitioners in the domain of automotive user-interfaces. The experimental design for an international usability study serves as an illustrative case example for the discussion. Participants learn about methods, challenges and current research on international evaluations of automotive user interfaces. The workshop's goal is to jointly derive a consensus for the theoretical and practical interpretation of the term usability in the context of HMIs for automated driving.","PeriodicalId":373501,"journal":{"name":"12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115218983","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Foresight Safety: Sharing Drivers’ State among Connected Road Users","authors":"P. Pretto, Sandra Trösterer, N. Ebinger, Nino Dum","doi":"10.1145/3409251.3411729","DOIUrl":"https://doi.org/10.1145/3409251.3411729","url":null,"abstract":"When drivers approach a potentially critical situation, they tend to glance over drivers of neighboring vehicles to gather a mutual understanding of the respective states and intentions. Then, experienced drivers can take quick decisions and prevent the onset of a danger. Yet, such a safety-effective behavior finds no equals in current automated driving, although the technologies to build a similar solution are already available. Therefore, it is important to investigate the effects of sharing drivers’ state among road users to understand the potential benefit for pre-critical situations. A networked simulators study was performed involving two drivers in a cut-in maneuver. Results indicate that when a driver is notified that the driver in the adjacent vehicle is distracted, the preferred reaction is to change lane, putting more space between the respective vehicles. Such a preventive action should therefore become the target behavior for automated vehicles capable of a human-like driving style.","PeriodicalId":373501,"journal":{"name":"12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications","volume":"59 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121448393","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Exploration of Users’ Thoughts on Rear-Seat Productivity in Virtual Reality","authors":"Jingyi Li, Ceenu George, Andrea Ngao, K. Holländer, Stefan Mayer, A. Butz","doi":"10.1145/3409251.3411732","DOIUrl":"https://doi.org/10.1145/3409251.3411732","url":null,"abstract":"With current technology, mobile working has become a real trend. With wireless head-mounted displays we could soon even be using immersive working environments while commuting. However, it is unclear what such a virtual workplace will look like. In anticipation of autonomous cars, we investigate the use of VR in the rear seat of current cars. Given the limited space, how will interfaces make us productive, but also keep us aware of the essentials of our surroundings? In interviews with 11 commuters, they generally could imagine using VR in cars for working, but were concerned with their physical integrity while in VR. Two types of preferred working environments stuck out in the physical dimension and three information levels for rear-seat VR productivity emerged from our interviews: productivity, notification, and environment. We believe that the interview results and proposed information levels can inspire the UI structure of future ubiquitous productivity applications.","PeriodicalId":373501,"journal":{"name":"12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123787401","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards A Framework of Detecting Mode Confusion in Automated Driving: Examples of Data from Older Drivers","authors":"Shabnam Haghzare, Jennifer L. Campos, Alex Mihailidis","doi":"10.1145/3409251.3411709","DOIUrl":"https://doi.org/10.1145/3409251.3411709","url":null,"abstract":"A driver's confusion about the dynamic operating modes of an Automated Vehicle (AV), and thereby their confusion about their driving responsibilities can compromise safety. To be able to detect drivers’ mode confusion in AVs, we expand on a previous theoretical model of mode confusion and operationalize it by first defining the possible operating modes within an AV. Consequently, using these AV modes as different classes, we then propose a classification framework that can potentially detect a driver's mode confusion by classifying the driver's perceived AV mode using measures of their gaze behavior. The potential applicability of this novel framework is demonstrated by a classification algorithm that can distinguish between drivers’ gaze behavior measures during two AV modes of fully-automated and non-automated driving with 93% average accuracy. The dataset was collected from older drivers (65+), who, due to changes in sensory and/or cognitive abilities can be more susceptible to mode confusion.","PeriodicalId":373501,"journal":{"name":"12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications","volume":"96 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122618865","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}