Passive user identification using sequential analysis of proximity information in touchscreen usage patterns
V. Zaliva, William Melicher, Shayan Saha, J. Zhang
2015 Eighth International Conference on Mobile Computing and Ubiquitous Networking (ICMU), published 2015-03-16. DOI: 10.1109/ICMU.2015.7061060. Citations: 7.
Modern touch screen sensors can detect and report finger presence not only upon contact but also as the finger approaches the screen. This provides a wealth of additional information which, to the best of our knowledge, has never been analyzed before. Using these new sensor capabilities, we can observe exactly how a user performs a gesture, from the finger's approach through the actual touch of the screen. We decode the proximity data collected from the phone's touch sensor and extract each user's finger "traces" along with contact-area shapes, which we use to distinguish the device's owner from other users. To further improve the classifier's accuracy, we develop a sequential classification approach that applies a probability ratio test to the outputs of an artificial neural network, reaching a decision in minimal time for predefined accuracy goals. The data allows not only discrimination between users but also detection of their dominant hand. These techniques could be used in many practical applications, such as passive user authentication or personalization. Our experiments show that after just five touches, or 12.6 seconds on average, we can correctly distinguish the primary user from any of 14 other known users by using proximity data to model the finger's approach pattern.
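To make the sequential decision step concrete: a probability ratio test accumulates evidence touch by touch and stops as soon as the accumulated evidence crosses a threshold set by the desired error rates. The sketch below is a rough illustration only, not the authors' implementation; it assumes a Wald-style SPRT in which each touch contributes the network's estimated probability that the touch came from the owner. The function name sprt_decision, the owner_probs input, and the alpha/beta error targets are illustrative assumptions.

```python
import math


def sprt_decision(owner_probs, alpha=0.01, beta=0.01):
    """Decide "owner" vs. "other" from a stream of per-touch scores.

    owner_probs: per-touch probabilities P(owner | touch) as produced by a
    classifier (e.g. a neural network) for each successive touch.
    alpha, beta: target false-accept / false-reject rates, i.e. the
    "predefined accuracy goals" that fix the stopping thresholds.

    Returns (decision, touches_used), where decision is "owner", "other",
    or "undecided" if the stream ends before a threshold is crossed.
    """
    # Wald thresholds derived from the target error rates.
    upper = math.log((1.0 - beta) / alpha)   # crossing it accepts "owner"
    lower = math.log(beta / (1.0 - alpha))   # crossing it accepts "other"

    llr = 0.0      # cumulative log-likelihood ratio over touches
    touches = 0
    for p in owner_probs:
        touches += 1
        p = min(max(p, 1e-6), 1.0 - 1e-6)    # guard against log(0)
        llr += math.log(p / (1.0 - p))       # evidence from this touch
        if llr >= upper:
            return "owner", touches
        if llr <= lower:
            return "other", touches
    return "undecided", touches


# Example: moderately confident per-touch scores stop after a few touches.
print(sprt_decision([0.80, 0.75, 0.90, 0.85, 0.95]))  # -> ("owner", 3)
```

With alpha = beta = 0.01 the thresholds are roughly plus or minus log 99, about 4.6 in log-likelihood units, so a handful of reasonably confident touches is enough to trigger a decision; the actual per-touch evidence model and stopping rule used in the paper may differ.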