Biologically Inspired Automotive User Interfaces for Partially and Highly Automated Maneuver Gestures: Final Results and Outlook
Nicolas D. Herzberger, Marcel Usai, Michael Preutenborbeck
Intelligent Human Systems Integration (IHSI 2022): Integrating People and Intelligent Systems
DOI: 10.54941/ahfe100975
Abstract
Automated driving poses severe challenges for the design and testing of automotive user interfaces. In partially automated driving, the driver remains responsible for vehicle control but is strongly supported by technology. In highly automated driving, the driver can hand control over to the automation for a certain time and receive it back, e.g., when the automation reaches its limits. Despite great technological progress, a truly intuitive way to interact with these automated driving modes is still under research. Project Vorreiter addresses this by drawing on the relationship between a rider and a horse to provide intuitive steering gestures on the steering wheel or an alternative device, which initiate maneuvers executed by the automation. These maneuvers can be supervised, influenced, or interrupted by the driver. The gestures follow a universal design approach that supports all drivers, including beginners and drivers with disabilities. After an introduction to the overall philosophy and concept, the contribution focuses on a final step in the project, an overall evaluation of the concept in a driving simulator, and presents new data, in particular on the comparison of swiping and pushing gestures with respect to true and false positive and negative gesture detections. Finally, a brief outlook sketches the next steps with a new Wizard-of-Oz / theater vehicle.
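The reported comparison of swiping and pushing gestures in terms of true/false positive and negative detections amounts to tallying a confusion matrix per gesture type. The sketch below is purely illustrative and not taken from the paper: the function name `confusion_counts` and the trial data are hypothetical, assuming each simulator trial records whether a gesture was intended by the driver and whether the system detected one.

```python
# Hypothetical sketch: tallying gesture-detection outcomes into a confusion
# matrix, as one might do when comparing swiping and pushing gestures.
from collections import Counter

def confusion_counts(intended, detected):
    """Classify each trial as TP, FP, FN, or TN.

    intended: list of bools, True if the driver actually performed a gesture.
    detected: list of bools, True if the system registered a gesture.
    """
    counts = Counter()
    for i, d in zip(intended, detected):
        if i and d:
            counts["TP"] += 1      # gesture performed and recognized
        elif not i and d:
            counts["FP"] += 1      # spurious detection
        elif i and not d:
            counts["FN"] += 1      # missed gesture
        else:
            counts["TN"] += 1      # correctly ignored
    return counts

# Example with made-up trial data for the two gesture types.
trials = {
    "swipe": ([True, True, False, True], [True, False, False, True]),
    "push":  ([True, True, True, False], [True, True, True, True]),
}
for gesture, (intended, detected) in trials.items():
    print(gesture, dict(confusion_counts(intended, detected)))
```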