Impersonal smartphone-based activity recognition using the accelerometer sensory data

Therdsak Dungkaew, J. Suksawatchon, U. Suksawatchon
{"title":"Impersonal smartphone-based activity recognition using the accelerometer sensory data","authors":"Therdsak Dungkaew, J. Suksawatchon, U. Suksawatchon","doi":"10.1109/INCIT.2017.8257856","DOIUrl":null,"url":null,"abstract":"Smartphone-based activity recognition focuses on identifying the current activities of a mobile user by employing the sensory data which are available on smartphones. A lightweight model and less inquiry users for true activities, are necessary for deploying the activity recognition on a mobile platform for identifying activities based on new sensory data in real time. In this paper, we propose a new smartphone-based activity recognition framework for evolving sensory data stream called ISAR. It stands for Impersonal Smartphone-based Activity Recognition. ISAR model is built using annotated sensory data from a panel of user as training data and are applied to the new users. Our new model is an offline and online phase. In offline phase, we propose a new method for finding the threshold value which used to distinguish between dormant activities and energetic activities. Only a set of the energetic activities are used to build a light-weight classifier model. In online phase, we introduce the recognition technique of unannotated streaming sensory data with different activities. The experimental results using real human activity recognition data have conducted and compared with STAR model in terms of the accuracy and time complexity. Our results indicates that ISAR model can perform dramatically better than STAR model. Moreover, ISAR can utilize better than STAR model in real situation, especially across different users and without inquiry users.","PeriodicalId":405827,"journal":{"name":"2017 2nd International Conference on Information Technology (INCIT)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 2nd International Conference on Information Technology (INCIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/INCIT.2017.8257856","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 10

Abstract

Smartphone-based activity recognition focuses on identifying the current activities of a mobile user from the sensory data available on smartphones. Deploying activity recognition on a mobile platform, where activities must be identified from new sensory data in real time, requires a lightweight model and as few queries to users for their true activities as possible. In this paper, we propose a new smartphone-based activity recognition framework for evolving sensory data streams called ISAR, which stands for Impersonal Smartphone-based Activity Recognition. The ISAR model is built using annotated sensory data from a panel of users as training data and is then applied to new users. The framework consists of an offline phase and an online phase. In the offline phase, we propose a new method for finding the threshold value used to distinguish dormant activities from energetic activities. Only the set of energetic activities is used to build a lightweight classifier model. In the online phase, we introduce a technique for recognizing unannotated streaming sensory data covering different activities. Experiments were conducted on real human activity recognition data and compared against the STAR model in terms of accuracy and time complexity. Our results indicate that the ISAR model performs dramatically better than the STAR model. Moreover, ISAR is better suited than STAR to real deployments, especially across different users and without querying users.
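
The abstract does not spell out how the offline threshold is computed or applied; the sketch below only illustrates the general idea, assuming (hypothetically) that the threshold is compared against the per-window variability of the accelerometer magnitude to separate dormant from energetic activities. The function names, window length, and example threshold are illustrative assumptions, not details from the paper.

```python
import numpy as np

def window_energy(acc_xyz, fs=50, window_s=2.0):
    """Split a tri-axial accelerometer stream into fixed-size windows and
    return the standard deviation of the acceleration magnitude per window.

    acc_xyz: array of shape (n_samples, 3); fs: sampling rate in Hz.
    (Illustrative feature only; the paper's actual feature may differ.)
    """
    mag = np.linalg.norm(acc_xyz, axis=1)   # acceleration magnitude per sample
    win = int(fs * window_s)                # samples per window
    n = (len(mag) // win) * win             # drop the trailing partial window
    windows = mag[:n].reshape(-1, win)
    return windows.std(axis=1)              # variability as an "energy" proxy

def split_dormant_energetic(acc_xyz, threshold, fs=50):
    """Label each window as dormant (below threshold) or energetic."""
    energy = window_energy(acc_xyz, fs=fs)
    return np.where(energy < threshold, "dormant", "energetic")

if __name__ == "__main__":
    # Synthetic example: a near-static segment followed by vigorous motion.
    # The threshold value 0.5 is hypothetical, not taken from the paper.
    rng = np.random.default_rng(0)
    still = rng.normal(9.81, 0.05, (500, 3))
    moving = rng.normal(9.81, 1.5, (500, 3))
    stream = np.vstack([still, moving])
    print(split_dormant_energetic(stream, threshold=0.5))
```

In such a scheme, only windows labeled energetic would be passed on to the lightweight classifier, which is consistent with the abstract's statement that only energetic activities are used to build the model.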