User independent, multi-modal spotting of subtle arm actions with minimal training data

Gerald Bauer, Ulf Blanke, P. Lukowicz, B. Schiele
{"title":"User independent, multi-modal spotting of subtle arm actions with minimal training data","authors":"Gerald Bauer, Ulf Blanke, P. Lukowicz, B. Schiele","doi":"10.1109/PerComW.2013.6529448","DOIUrl":null,"url":null,"abstract":"We address a specific, particularly difficult class of activity recognition problems defined by (1) subtle, and hardly discriminative hand motions such as a short press or pull, (2) large, ill defined NULL class (any other hand motion a person may express during normal life), and (3) difficulty of collecting sufficient training data, that generalizes well from one to multiple users. In essence we intend to spot activities such as opening a cupboard, pressing a button, or taking an object from a shelve in a large data stream that contains typical every day activity. We focus on body-worn sensors without instrumenting objects, we exploit available infrastructure information, and we perform a one-to-many-users training scheme for minimal training effort. We demonstrate that a state of the art motion sensors based approach performs poorly under such conditions (Equal Error Rate of 18% in our experiments). 
We present and evaluate a new multi modal system based on a combination of indoor location with a wrist mounted proximity sensor, camera and inertial sensor that raises the EER to 79%.","PeriodicalId":101502,"journal":{"name":"2013 IEEE International Conference on Pervasive Computing and Communications Workshops (PERCOM Workshops)","volume":"21 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 IEEE International Conference on Pervasive Computing and Communications Workshops (PERCOM Workshops)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/PerComW.2013.6529448","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

We address a specific, particularly difficult class of activity recognition problems defined by (1) subtle, hardly discriminative hand motions such as a short press or pull, (2) a large, ill-defined NULL class (any other hand motion a person may perform during normal life), and (3) the difficulty of collecting sufficient training data that generalizes well from one to multiple users. In essence, we intend to spot activities such as opening a cupboard, pressing a button, or taking an object from a shelf in a large data stream that contains typical everyday activity. We focus on body-worn sensors without instrumenting objects, exploit available infrastructure information, and use a one-to-many-users training scheme for minimal training effort. We demonstrate that a state-of-the-art motion-sensor-based approach performs poorly under such conditions (Equal Error Rate of 18% in our experiments). We present and evaluate a new multi-modal system based on a combination of indoor location with a wrist-mounted proximity sensor, camera, and inertial sensor that raises the EER to 79%.
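The abstract reports results as an Equal Error Rate (EER), the operating point at which the false-accept rate on the NULL class equals the false-reject rate on genuine activity segments. As a minimal illustrative sketch (not the paper's implementation, and using synthetic detector scores), the EER of a score-based spotter could be estimated like this:

```python
import random

def equal_error_rate(pos_scores, neg_scores):
    """Approximate the Equal Error Rate (EER): sweep a decision threshold
    and find the point where the false-accept rate (NULL-class segments
    accepted) is closest to the false-reject rate (genuine activity
    segments missed)."""
    thresholds = sorted(pos_scores + neg_scores)
    best_gap, best_eer = float("inf"), None
    for t in thresholds:
        far = sum(s >= t for s in neg_scores) / len(neg_scores)  # false accepts
        frr = sum(s < t for s in pos_scores) / len(pos_scores)   # false rejects
        if abs(far - frr) < best_gap:
            best_gap, best_eer = abs(far - frr), (far + frr) / 2
    return best_eer

# Toy example with synthetic detector scores (illustrative only):
rng = random.Random(0)
pos = [rng.gauss(1.0, 0.5) for _ in range(300)]   # genuine activity segments
neg = [rng.gauss(0.0, 0.5) for _ in range(3000)]  # NULL-class segments
print(round(equal_error_rate(pos, neg), 3))
```

With overlapping score distributions like these, the EER lands between 0 and 1, and the larger the NULL class relative to the genuine class, the more a small false-accept rate matters in practice, which is exactly the difficulty the paper highlights.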