Vision and Acceleration Modalities: Partners for Recognizing Complex Activities

Alexander Diete, T. Sztyler, H. Stuckenschmidt
DOI: 10.1109/PERCOMW.2019.8730690
Venue: 2019 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)
Published: 2019-03-11
Citations: 8

Abstract

Wearable devices have been used widely for human activity recognition in the field of pervasive computing. One major area of this research is the recognition of activities of daily living, where inertial and interaction sensors such as RFID tags and scanners are commonly used. An issue that may arise when using interaction sensors is a lack of certainty: a positive signal from an interaction sensor is not necessarily caused by a performed activity, e.g., when an object is only touched but no interaction occurs afterwards. In our work, we aim to overcome this limitation and present a multi-modal, egocentric activity recognition approach that recognizes the critical activities by looking at movement and object information at the same time. We present our results of combining inertial and video features to recognize human activities across different types of scenarios, where we achieve an $F_1$-measure of up to 79.6%.
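The abstract describes fusing movement (inertial) information with object (visual) information so that an object cue alone does not trigger a false detection, and reports results as an $F_1$-measure. The following is a minimal, hypothetical sketch of such early fusion and of the metric itself; the function names, features, and cue encoding are illustrative assumptions, not the authors' actual pipeline:

```python
# Illustrative sketch only (NOT the authors' pipeline): early fusion of
# per-window inertial statistics with a binary visual object cue, plus
# the F1-measure used as the evaluation metric in the abstract.

def inertial_features(window):
    """Mean and variance over a window of accelerometer magnitudes."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    return [mean, var]

def fuse(inertial, object_seen):
    """Early fusion: concatenate movement stats with an object cue.

    Combining both modalities lets a classifier reject cases where an
    object is detected but no corresponding movement occurred.
    """
    return inertial + [1.0 if object_seen else 0.0]

def f1_score(y_true, y_pred, positive=1):
    """Harmonic mean of precision and recall for the positive class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)
```

A downstream classifier trained on the fused vectors can then learn, for instance, that an "object seen" cue with near-zero movement variance is likely a touch rather than a performed activity, which is precisely the ambiguity the paper targets.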