Title: "An Emergent Design Framework for Accessible and Inclusive Future Mobility"
Authors: Henrik Detjen, Stefan Schneegass, Stefan Geisler, A. Kun, V. Sundar
Venue: Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2022-09-17
DOI: https://doi.org/10.1145/3543174.3546087
Abstract: Future mobility will be highly automated, multimodal, and ubiquitous, and will thus have the potential to serve a broader range of users. Yet non-average users with special needs are often underrepresented, or simply not thought of, in the design processes of vehicles and mobility services, leading to their exclusion from standard transportation. It is therefore crucial for designers of such vehicles and services to consider the needs of non-average users from the beginning. In this paper, we present a design framework that helps designers take the perspective of non-average users and think through their needs. We present a set of exemplary applications from the literature and from interviews, show how they fit into the framework, and indicate room for further development. We further demonstrate how the framework supports the design of a mobility service in a fictional design process. Overall, our work contributes to the universal design of future mobility.
Title: "Effect of System Capability Verification on Conflict, Trust, and Behavior in Automated Vehicles"
Authors: Marcel Woide, Mark Colley, Nicole Damm, M. Baumann
Venue: Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2022-09-17
DOI: https://doi.org/10.1145/3543174.3545253
Abstract: With automated driving, vehicles are no longer just tools but become teammates, which opens up a growing space of new interaction possibilities. As the relationship between drivers and automated vehicles (AVs) changes, conflicts over maneuver selection can arise, and such conflicts can lead to safety-critical takeovers by the drivers. Current research mainly focuses on information requirements for takeovers; only a few works have explored the factors necessary for automation engagement. We therefore conducted a fixed-base driving simulator study with N=28 participants to investigate how verifiable information influences automation engagement, gaze behavior, trust, conflict, criticality, stress, and interaction perception. The results indicate that if drivers can verify the information given by the system, they perceive less conflict and more trust in the system, leading to a lower rejection frequency for an overtaking maneuver performed by the AV. The results suggest that systems that aim to prevent driver-initiated interventions should provide verifiable information.
Title: "Pedestrian-Vehicle Interaction in Shared Space: Insights for Autonomous Vehicles"
Authors: Yiyuan Wang, L. Hespanhol, Stewart Worrall, M. Tomitsch
Venue: Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2022-09-17
DOI: https://doi.org/10.1145/3543174.3546838
Abstract: Shared space reduces segregation between vehicles and pedestrians and encourages them to share roads without imposed traffic rules. The behaviour of road users (RUs) is then governed by social norms, and interactions are more versatile than on traditional roads. Autonomous vehicles (AVs) will need to adapt to these norms to become socially acceptable RUs in shared spaces. To date, however, there is little research into pedestrian-vehicle interaction in shared-space environments, and prior efforts have predominantly focused on traditional roads and crossing scenarios. We present a video-observation study, based on a long-term naturalistic driving dataset, of pedestrian reactions to a small, automation-capable vehicle driven manually in shared spaces. We report a range of pedestrian reactions (from movement adjustment to prosocial behaviour) and situations pertinent to shared spaces at this early stage. The insights drawn can serve as a foundation for future AVs navigating shared spaces, especially those with a high pedestrian focus.
Title: "Shape-Changing Interfaces in the Automotive Context: A Taxonomy to Aid the Systematic Development of Intuitive Gesture-Based eHMIs"
Authors: Debargha Dey, Coen de Zeeuw, Miguel Bruns, Brady Michael Kuhl, Bastian Pfleging
Venue: Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2022-09-17
DOI: https://doi.org/10.1145/3543174.3546085
Abstract: This paper presents a structured framework of automotive shape-changing interfaces, which can guide researchers and practitioners in the automotive user interface domain in designing interactions between vulnerable road users (VRUs) and automated vehicles (AVs). Recent research on external human-machine interfaces (eHMIs) for facilitating AV-VRU interaction has looked into the potential of external shape change (eSC) as a means of intuitively communicating an AV's intent, and calls for more structured design explorations. To systematize this unstructured design space, this paper gives an overview of how shape change is currently implemented and executed in the automotive context, for communication purposes and beyond. The paper makes two contributions: (1) an examination of the state of the art of automotive shape-changing interfaces, and (2) a reusable taxonomy that can be used to structure the design space and identify the potential of shape-changing interfaces for more intuitive eHMIs.
Title: "Investigating the Influence of Gaze- and Context-Adaptive Head-up Displays on Take-Over Requests"
Authors: Henrik Detjen, S. Faltaous, Jonas Keppel, Marvin Prochazka, Uwe Gruenefeld, Shadan Sadeghian, Stefan Schneegass
Venue: Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2022-09-17
DOI: https://doi.org/10.1145/3543174.3546089
Abstract: In Level 3 automated vehicles, preparing drivers for take-over requests (TORs) on the head-up display (HUD) requires their repeated attention. Visually salient HUD elements can distract attention from potentially critical parts of a driving scene during a TOR. Further, attention is (a) simultaneously needed for non-driving-related activities and (b) can be over-requested. In this paper, we report a driving simulator study (N=12) varying the attention required via HUD warning presence (absent vs. constant vs. TOR-only) and gaze-adaptivity (with vs. without) to fit warnings to the situation. We found that (1) drivers value visual support during TORs, (2) gaze-adaptive scene-complexity reduction works but creates a benefit-neutralizing distraction for some, and (3) drivers perceive constant HUD warnings as annoying and distracting over time. Our findings highlight the need for (a) HUD adaptation based on user activities and potential TORs and (b) sparse use of warning cues in future HUD designs.
Title: "Can Eyes on a Car Reduce Traffic Accidents?"
Authors: Chia-Ming Chang, Koki Toda, Xinyue Gui, S. Seo, T. Igarashi
Venue: Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2022-09-17
DOI: https://doi.org/10.1145/3543174.3546841
Abstract: Various car manufacturers and researchers have explored the idea of adding eyes to a car as an additional communication modality. A previous study demonstrated that autonomous vehicles' (AVs) eyes help pedestrians make faster street-crossing decisions. In this study, we examine a more critical question: can eyes reduce traffic accidents? To answer it, we consider a critical street-crossing situation in which a pedestrian is in a hurry to cross the street. If the car is not looking at the pedestrian, this implies that the car has not recognized the pedestrian, so the pedestrian can judge that they should not cross, thereby avoiding a potential accident. We conducted an empirical study using 360-degree video recordings of an actual car fitted with robotic eyes. The results showed that the eyes can reduce potential traffic accidents and that gaze direction can increase pedestrians' subjective feelings of safety and danger. The results also revealed gender differences in critical and noncritical scenarios of AV-to-pedestrian interaction.
Title: "Get Out of The Way! Examining eHMIs in Critical Driver-Pedestrian Encounters in a Coupled Simulator"
Authors: P. Bazilinskyy, L. Kooijman, Dimitra Dodou, Kirsten Mallant, Victor Roosens, M. Middelweerd, Lucas Overbeek, J. D. de Winter
Venue: Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2022-09-17
DOI: https://doi.org/10.1145/3543174.3546849
Abstract: Past research suggests that displays on the exterior of a car, known as eHMIs, can help pedestrians make safe crossing decisions. This study examines a new application of eHMIs: providing directional information in scenarios where the pedestrian is almost hit by a car. In an experiment using a head-mounted display and a motion suit, participants had to cross the road while a car driven by another participant approached them. The results showed that the directional eHMI caused pedestrians to step back compared to no eHMI. The eHMI increased the pedestrians' self-reported understanding of the car's intention, although some pedestrians did not notice it. In conclusion, there may be potential for supporting pedestrians in the situations where they need support most: critical encounters. Future research may consider coupling a directional eHMI to autonomous emergency steering.
Title: "'I am going this way': Gazing Eyes on Self-Driving Car Show Multiple Driving Directions"
Authors: Xinyue Gui, Koki Toda, S. Seo, Chia-Ming Chang, T. Igarashi
Venue: Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2022-09-17
DOI: https://doi.org/10.1145/3543174.3545251
Abstract: Modern cars express three moving directions (left, right, straight) using turn signals (i.e., blinkers), which is insufficient when multiple paths lead to the same side. Drivers therefore give additional hints (e.g., gestures, eye contact) in conventional car-to-pedestrian interaction. As more self-driving cars without drivers join public roads, additional communication channels are needed. In this work, we discuss the problem of self-driving cars expressing their fine-grained moving direction to pedestrians beyond what blinkers convey. We built anthropomorphic robotic eyes, mounted them on a real car, and applied the eye-gazing technique grounded in the common knowledge that one gazes in the direction one is heading. In a formal VR-based user study, we found that the eyes can convey fine-grained directions: participants could distinguish five directions with a lower error rate and in less time than with conventional turn signals.
Title: "You'll Never Ride Alone: Insights into Women's Security Needs in Shared Automated Vehicles"
Authors: Martina Schuß, Carina Manger, Andreas Löcken, A. Riener
Venue: Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2022-09-17
DOI: https://doi.org/10.1145/3543174.3546848
Abstract: Shared automated vehicles (SAVs) are expected to benefit society and the environment, as vehicles and rides are shared among passengers. However, this requires acceptance by different types of people. Recent research confirms that women and older people in particular are concerned about this form of mobility for security reasons. These concerns must be addressed to ensure that women and senior citizens also adopt SAVs. Accordingly, we conducted a qualitative user study (N=21) using participatory design methods. Our work contributes insights into women's security needs by taking a holistic view of an SAV ride, from booking to arrival, from the perspective of women of different age groups. From our results, we derived general design implications and propose three concrete concepts for high levels of security. Lastly, we present a research agenda for further investigation of security concepts in SAVs.
Title: "A Lab-Based Investigation of Reaction Time and Reading Performance using Different In-Vehicle Reading Interfaces during Self-Driving"
Authors: Lei Hsiung, Yung-Ju Chang, Wei-Ko Li, Tsung-Yi Ho, Shan-Hung Wu
Venue: Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2022-09-17
DOI: https://doi.org/10.1145/3543174.3545254
Abstract: The demand for autonomous vehicles (AVs) has grown rapidly in recent years. As AVs can free drivers' cognitive resources from driving for other tasks, reading is one of the common activities users pursue when multitasking while travelling. Nevertheless, ways of supporting reading in AVs have been little explored. To fill this gap, we explored the design of an in-vehicle reader shown on the windshield of an AV along three dimensions: dynamics, position, and text segmentation. We conducted two in-lab within-subject experiments to examine eight in-car reading modalities, representing the combinations of the three dimensions, in terms of drivers' reaction time and reading comprehension. Our results show a case where adaptive positioning would be particularly beneficial for supporting reading in AVs; our general suggestion is to use a static reading zone presented on-sky and in sentences, as it leads to faster reactions and better reading comprehension.