{"title":"TapSongs:在单个二进制传感器上轻敲基于节奏的密码","authors":"J. Wobbrock","doi":"10.1145/1622176.1622194","DOIUrl":null,"url":null,"abstract":"TapSongs are presented, which enable user authentication on a single \"binary\" sensor (e.g., button) by matching the rhythm of tap down/up events to a jingle timing model created by the user. We describe our matching algorithm, which employs absolute match criteria and learns from successful logins. We also present a study of 10 subjects showing that after they created their own TapSong models from 12 examples (< 2 minutes), their subsequent login attempts were 83.2% successful. Furthermore, aural and visual eavesdropping of the experimenter's logins resulted in only 10.7% successful imposter logins by subjects. Even when subjects heard the target jingles played by a synthesized piano, they were only 19.4% successful logging in as imposters. These results are attributable to subtle but reliable individual differences in people's tapping, which are supported by prior findings in music psychology.","PeriodicalId":93361,"journal":{"name":"Proceedings of the ACM Symposium on User Interface Software and Technology. ACM Symposium on User Interface Software and Technology","volume":"1 1","pages":"93-96"},"PeriodicalIF":0.0000,"publicationDate":"2009-10-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"62","resultStr":"{\"title\":\"TapSongs: tapping rhythm-based passwords on a single binary sensor\",\"authors\":\"J. Wobbrock\",\"doi\":\"10.1145/1622176.1622194\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"TapSongs are presented, which enable user authentication on a single \\\"binary\\\" sensor (e.g., button) by matching the rhythm of tap down/up events to a jingle timing model created by the user. We describe our matching algorithm, which employs absolute match criteria and learns from successful logins. We also present a study of 10 subjects showing that after they created their own TapSong models from 12 examples (< 2 minutes), their subsequent login attempts were 83.2% successful. Furthermore, aural and visual eavesdropping of the experimenter's logins resulted in only 10.7% successful imposter logins by subjects. Even when subjects heard the target jingles played by a synthesized piano, they were only 19.4% successful logging in as imposters. These results are attributable to subtle but reliable individual differences in people's tapping, which are supported by prior findings in music psychology.\",\"PeriodicalId\":93361,\"journal\":{\"name\":\"Proceedings of the ACM Symposium on User Interface Software and Technology. ACM Symposium on User Interface Software and Technology\",\"volume\":\"1 1\",\"pages\":\"93-96\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2009-10-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"62\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the ACM Symposium on User Interface Software and Technology. 
ACM Symposium on User Interface Software and Technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/1622176.1622194\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the ACM Symposium on User Interface Software and Technology. ACM Symposium on User Interface Software and Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/1622176.1622194","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
TapSongs: tapping rhythm-based passwords on a single binary sensor
We present TapSongs, which enable user authentication on a single "binary" sensor (e.g., a button) by matching the rhythm of tap down/up events to a jingle timing model created by the user. We describe our matching algorithm, which employs absolute match criteria and learns from successful logins. We also present a study of 10 subjects showing that after they created their own TapSong models from 12 examples (< 2 minutes), their subsequent login attempts were 83.2% successful. Furthermore, after aurally and visually eavesdropping on the experimenter's logins, subjects succeeded in only 10.7% of their imposter login attempts. Even when subjects heard the target jingles played by a synthesized piano, they succeeded in only 19.4% of imposter login attempts. These results are attributable to subtle but reliable individual differences in people's tapping, which are supported by prior findings in music psychology.
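
The abstract describes matching the rhythm of tap events against a user-created jingle timing model, using absolute match criteria and updating the model after successful logins. Below is a minimal sketch of how such a matcher might look, assuming a model that stores per-interval timing statistics normalized for overall tempo; the class name TapSongModel, the REL_TOLERANCE parameter, and the normalization scheme are illustrative assumptions, not the paper's published algorithm.

```python
# Minimal sketch of a TapSong-style rhythm matcher. The abstract only says the
# algorithm uses absolute match criteria and learns from successful logins;
# the thresholds, normalization, and names below (TapSongModel, REL_TOLERANCE)
# are illustrative assumptions, not the published method.

from dataclasses import dataclass, field
from statistics import mean
from typing import List

REL_TOLERANCE = 0.25  # assumed: allowed relative deviation per interval


@dataclass
class TapSongModel:
    """Timing model built from a user's enrollment examples of one jingle."""
    interval_means: List[float] = field(default_factory=list)
    examples: List[List[float]] = field(default_factory=list)

    @staticmethod
    def _normalize(intervals: List[float]) -> List[float]:
        # Normalize by total duration so overall tempo does not matter,
        # only the relative rhythm.
        total = sum(intervals)
        return [i / total for i in intervals]

    @classmethod
    def train(cls, example_timings: List[List[float]]) -> "TapSongModel":
        # Each example is the list of intervals (seconds) between tap events
        # from one enrollment rendition of the jingle.
        normalized = [cls._normalize(e) for e in example_timings]
        n = len(normalized[0])
        means = [mean(e[i] for e in normalized) for i in range(n)]
        return cls(interval_means=means, examples=normalized)

    def matches(self, attempt: List[float]) -> bool:
        # Absolute criteria: reject if the tap count differs or if any
        # normalized interval deviates more than REL_TOLERANCE from its mean.
        if len(attempt) != len(self.interval_means):
            return False
        norm = self._normalize(attempt)
        return all(
            abs(a - m) <= REL_TOLERANCE * m
            for a, m in zip(norm, self.interval_means)
        )

    def learn(self, attempt: List[float]) -> None:
        # After a successful login, fold the attempt back into the model so
        # it tracks gradual drift in the user's tapping.
        self.examples.append(self._normalize(attempt))
        n = len(self.interval_means)
        self.interval_means = [mean(e[i] for e in self.examples)
                               for i in range(n)]


# Toy usage: enroll from 12 (here identical) renditions, then verify logins.
enrollment = [[0.30, 0.30, 0.60, 0.30]] * 12
model = TapSongModel.train(enrollment)
assert model.matches([0.32, 0.29, 0.58, 0.31])      # close rhythm -> accept
assert not model.matches([0.30, 0.60, 0.30, 0.30])  # wrong rhythm -> reject
model.learn([0.32, 0.29, 0.58, 0.31])               # adapt after acceptance
```

Normalizing by total duration keeps the check tempo-invariant while the fixed per-interval thresholds (rather than a probabilistic score) echo the abstract's "absolute match criteria", and folding accepted attempts back into the interval means echoes "learns from successful logins"; the actual paper should be consulted for the real algorithm.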