AnnoTainted: Automating Physical Activity Ground Truth Collection Using Smartphones

Rahul Majethia, Akshit Singhal, Lakshmi Manasa K, K. Sahiti, Shubhangi Kishore, Vijay Nandwani
{"title":"AnnoTainted: Automating Physical Activity Ground Truth Collection Using Smartphones","authors":"Rahul Majethia, Akshit Singhal, Lakshmi Manasa K, K. Sahiti, Shubhangi Kishore, Vijay Nandwani","doi":"10.1145/2935651.2935653","DOIUrl":null,"url":null,"abstract":"In this work, we provide motivation for a zero-effort crowdsensing task: auto-annotated ground truth collection for physical activity recognition. Data obtained through Smartphones for classification of human activities is prone to discrepancies, which reiterates the need for better and larger activity datasets. Artificial data generation algorithms fail to efficiently generate quality instances for minority data. In the proposed model, crowd-sourced sensor data is classified by a robust classifier built by researchers ground up. We nominate a Generic Classifier with ≥ 95% accuracy for this purpose. Data collection and distribution models which ensure that the crowd client receives non-skewed, quality data from locations with higher degree of activity occurrence are elucidated upon. Also integrated within our proposed model are Location-Specific Classifiers, which can be utilized by developers to optimize on location-specific tasks. 
Effective validation of classified activities using diverse sensor data streams improves the proposed classifier systems and boosts ground-truth accuracy.","PeriodicalId":139697,"journal":{"name":"Workshop on Physical Analytics","volume":"41 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Workshop on Physical Analytics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2935651.2935653","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

In this work, we motivate a zero-effort crowdsensing task: auto-annotated ground-truth collection for physical activity recognition. Data obtained through smartphones for classifying human activities is prone to discrepancies, which underscores the need for better and larger activity datasets. Artificial data generation algorithms fail to efficiently generate quality instances for minority classes. In the proposed model, crowd-sourced sensor data is classified by a robust classifier built from the ground up by researchers; we nominate a Generic Classifier with ≥ 95% accuracy for this purpose. We describe data collection and distribution models that ensure the crowd client receives non-skewed, quality data from locations with a higher degree of activity occurrence. Our model also integrates Location-Specific Classifiers, which developers can use to optimize location-specific tasks. Effective validation of classified activities against diverse sensor data streams improves the proposed classifier systems and boosts ground-truth accuracy.
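The auto-annotation pipeline the abstract describes — classify crowd-sourced sensor windows with a high-accuracy classifier and keep only high-confidence predictions as ground truth — can be sketched minimally. Everything below (the window sizes, the variance-threshold toy classifier, the confidence gate, and the synthetic accelerometer stream) is illustrative and assumed, not the paper's actual Generic Classifier.

```python
import math
from statistics import mean, pstdev

def windows(signal, size, step):
    """Split a 1-D sensor stream into fixed-size overlapping windows."""
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, step)]

def classify(window, std_threshold=0.5):
    """Toy activity classifier: a high-variance accelerometer-magnitude
    window is labelled 'walking', a low-variance one 'stationary'.
    Confidence grows with the margin from the decision threshold."""
    sd = pstdev(window)
    confidence = min(1.0, abs(sd - std_threshold) / std_threshold)
    label = "walking" if sd > std_threshold else "stationary"
    return label, confidence

def auto_annotate(signal, size=50, step=25, min_conf=0.9):
    """Keep only windows labelled with high confidence, mimicking the
    high-accuracy gate that makes the labels usable as ground truth.
    Ambiguous windows (e.g. activity transitions) are discarded."""
    annotated = []
    for w in windows(signal, size, step):
        label, conf = classify(w)
        if conf >= min_conf:
            annotated.append((label, w))
    return annotated

# Synthetic magnitude stream: 100 near-constant samples ('stationary')
# followed by 100 strongly oscillating samples ('walking').
stationary = [1.0 + 0.01 * math.sin(i) for i in range(100)]
walking = [1.0 + 1.5 * math.sin(i * 0.8) for i in range(100)]
labels = [lab for lab, _ in auto_annotate(stationary + walking)]
```

In this sketch the window straddling the activity transition falls below the confidence gate and is dropped, so only cleanly labelled windows survive as auto-annotated ground truth — the same filtering idea, at toy scale, that the ≥ 95%-accuracy Generic Classifier enables in the proposed model.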