A multimodal data set for evaluating continuous authentication performance in smartphones

Q. Yang, Ge Peng, David T. Nguyen, Xin Qi, Gang Zhou, Zdenka Sitova, Paolo Gasti, K. Balagani
DOI: 10.1145/2668332.2668366 (https://doi.org/10.1145/2668332.2668366)
Published in: Proceedings of the 12th ACM Conference on Embedded Network Sensor Systems, November 3, 2014
Citations: 41

Abstract

Continuous authentication modalities allow a device to authenticate users transparently, without interrupting them or requiring their attention. This is especially important on smartphones, which are more prone to being lost or stolen than regular computers and carry a great deal of sensitive information. A multitude of signals can be harnessed for continuous authentication on mobile devices, such as touch input, accelerometer readings, and gyroscope readings. However, existing public datasets include only a handful of them, limiting the ability to run experiments that involve multiple modalities. To fill this gap, we performed a large-scale user study to collect a wide spectrum of signals on smartphones. Our dataset combines more modalities than existing datasets, including movement, orientation, touch, gestures, and pausality. This dataset has been used to evaluate our new behavioral modality named Hand Movement, Orientation, and Grasp (H-MOG). This poster reports on the data collection process and outcomes, as well as preliminary authentication results.
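To make the idea concrete, the sensor signals described above are typically summarized into per-window feature vectors and compared against a per-user template. The sketch below is a minimal, hypothetical illustration of that pipeline (it is not the authors' H-MOG method): it enrolls a user by averaging feature vectors and accepts a fresh window when its Euclidean distance to the template falls below a threshold. The feature values and the threshold are assumptions for illustration only.

```python
import math

def enroll(samples):
    """Build a per-user template as the mean of enrollment feature vectors.

    Each sample is a feature vector (e.g., summary statistics of
    accelerometer/gyroscope readings over a short window); the features
    here are hypothetical placeholders.
    """
    n = len(samples)
    dim = len(samples[0])
    return [sum(s[i] for s in samples) / n for i in range(dim)]

def score(template, sample):
    """Euclidean distance between a fresh window and the template.

    Lower scores mean the window looks more like the enrolled user.
    """
    return math.sqrt(sum((t - x) ** 2 for t, x in zip(template, sample)))

def authenticate(template, sample, threshold=0.2):
    """Accept the window transparently if its distance is under threshold."""
    return score(template, sample) <= threshold

# Illustrative use with made-up 2-D feature vectors:
template = enroll([[0.1, 0.2], [0.3, 0.4]])
genuine = authenticate(template, [0.2, 0.35])   # close to the template
impostor = authenticate(template, [2.0, 2.0])   # far from the template
```

In practice a continuous-authentication system would evaluate many windows over time and fuse scores across modalities, but the enroll/score/decide structure stays the same.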