CNNAuth: Continuous Authentication via Two-Stream Convolutional Neural Networks

Hailong Hu, Yantao Li, Zhangqian Zhu, Gang Zhou
{"title":"CNNAuth: Continuous Authentication via Two-Stream Convolutional Neural Networks","authors":"Hailong Hu, Yantao Li, Zhangqian Zhu, Gang Zhou","doi":"10.1109/NAS.2018.8515693","DOIUrl":null,"url":null,"abstract":"We present a two-stream convolutional neural network based authentication system, CNNAuth, for continuously monitoring users' behavioral patterns, by leveraging the accelerometer and gyroscope on smartphones. We are among the first to exploit two streams of the time-domain data and frequency-domain data from raw sensor data for learning and extracting universal effective and efficient feature representations as the inputs of the convolutional neural network (CNN), and the extracted features are further selected by the principal component analysis (PCA). With these features, we use the one-class support vector machine (SVM) to train the classifier in the enrollment phase, and with the trained classifier and testing features, CNNAuth classifies the current user as a legitimate user or an impostor in the continuous authentication phase. We evaluate the performance of the two-stream CNN and CNNAuth, respectively, and the experimental results show that the two-stream CNN achieves an accuracy of 87.14%, and CNNAuth reaches the lowest authentication EER of 2.3% and consumes approximately 3 seconds for authentication.","PeriodicalId":115970,"journal":{"name":"2018 IEEE International Conference on Networking, Architecture and Storage (NAS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"25","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE International Conference on Networking, Architecture and Storage (NAS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/NAS.2018.8515693","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 25

Abstract

We present CNNAuth, a two-stream convolutional neural network (CNN)-based authentication system that continuously monitors users' behavioral patterns by leveraging the accelerometer and gyroscope on smartphones. We are among the first to exploit two streams of time-domain and frequency-domain data derived from the raw sensor data as inputs to the CNN for learning and extracting universal, effective, and efficient feature representations; the extracted features are further selected by principal component analysis (PCA). With these features, we train a one-class support vector machine (SVM) classifier in the enrollment phase, and with the trained classifier and the testing features, CNNAuth classifies the current user as a legitimate user or an impostor in the continuous authentication phase. We evaluate the two-stream CNN and CNNAuth separately, and the experimental results show that the two-stream CNN achieves an accuracy of 87.14%, while CNNAuth reaches a lowest authentication EER of 2.3% and takes approximately 3 seconds per authentication.
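The abstract describes a four-stage pipeline: windowed accelerometer and gyroscope signals feed a time-domain stream and a frequency-domain stream of a CNN, the concatenated CNN features are reduced by PCA, and a one-class SVM trained only on the legitimate user's enrollment data decides whether each new window comes from that user or an impostor. The following is a minimal sketch of that pipeline, not the authors' implementation: the window length, network depth, PCA variance threshold, and SVM parameters are illustrative assumptions, and random tensors stand in for real sensor windows.

```python
import torch
import torch.nn as nn
from sklearn.decomposition import PCA
from sklearn.svm import OneClassSVM

WINDOW = 128    # samples per sensor window (assumed, not from the paper)
CHANNELS = 6    # 3-axis accelerometer + 3-axis gyroscope


class StreamCNN(nn.Module):
    """One 1-D convolutional stream, reused for time- and frequency-domain input."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(CHANNELS, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # -> (batch, 64, 1), length-agnostic
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)        # (batch, 64)


class TwoStreamCNN(nn.Module):
    """Time-domain and frequency-domain streams whose features are concatenated."""
    def __init__(self):
        super().__init__()
        self.time_stream = StreamCNN()
        self.freq_stream = StreamCNN()

    def forward(self, x_time):
        # Frequency-domain stream input: magnitude spectrum of each channel.
        x_freq = torch.fft.rfft(x_time, dim=-1).abs()
        return torch.cat([self.time_stream(x_time),
                          self.freq_stream(x_freq)], dim=1)   # (batch, 128)


# Enrollment phase: extract features from the legitimate user's windows,
# select them with PCA, and train a one-class SVM on that user only.
model = TwoStreamCNN().eval()                         # assume pretrained weights
enroll_windows = torch.randn(200, CHANNELS, WINDOW)   # placeholder sensor data
with torch.no_grad():
    enroll_feats = model(enroll_windows).numpy()

pca = PCA(n_components=0.95).fit(enroll_feats)        # keep 95% variance (assumed)
svm = OneClassSVM(kernel="rbf", nu=0.1).fit(pca.transform(enroll_feats))

# Continuous authentication phase: classify the current window.
test_window = torch.randn(1, CHANNELS, WINDOW)        # placeholder sensor data
with torch.no_grad():
    test_feat = pca.transform(model(test_window).numpy())
print("legitimate user" if svm.predict(test_feat)[0] == 1 else "impostor")
```

In a deployed system the enrollment windows would come from the enrolled user's real sensor traces, and the two-stream CNN would be trained on data from many users before being frozen as a feature extractor; the random tensors above only keep the sketch self-contained.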