Whiskers: Exploring the Use of Ultrasonic Haptic Cues on the Face
Hyunjae Gil, Hyungki Son, Jin Ryong Kim, Ian Oakley
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI 2018), April 21, 2018
DOI: https://doi.org/10.1145/3173574.3174232
Haptic cues are a valuable feedback mechanism for smart glasses. Prior work has shown how they can support navigation, deliver notifications, and cue targets. However, a focus on actuation technologies such as mechanical tactors or fans has restricted the scope of research to a small number of cues presented at fixed locations. To move beyond this limitation, we explore perception of in-air ultrasonic haptic cues on the face. We present two studies examining the fundamental properties of localization, duration, and movement perception at three facial sites suitable for use with glasses: the cheek, the center of the forehead, and above the eyebrow. The center of the forehead yielded the best performance, with a localization error of 3.77 mm and accurate perception of duration (80%) and movement (87%). We apply these findings in a study delivering eight different ultrasonic notifications and report mean recognition rates of up to 92.4% (peak: 98.6%). We close with design recommendations for ultrasonic haptic cues on the face.