AFAR: A Real-Time Vision-based Activity Monitoring and Fall Detection Framework using 1D Convolutional Neural Networks

J. Suarez, Nathaniel S. Orillaza, P. Naval
{"title":"AFAR: A Real-Time Vision-based Activity Monitoring and Fall Detection Framework using 1D Convolutional Neural Networks","authors":"J. Suarez, Nathaniel S. Orillaza, P. Naval","doi":"10.1145/3529836.3529862","DOIUrl":null,"url":null,"abstract":"In recent years, there has been an increased interest in the use of telemedicine as an option to avail proper healthcare. However, one of the main issues of activity monitoring and fall detection in telehealth systems is the scalability of the technology for areas with inadequate technology infrastructure. As a potential solution, this study proposes an efficient activity monitoring and fall detection framework which can run real-time on CPU devices. In comparison to previous works, this study makes use of an efficient pose estimator called MediaPipe and leverages the pose joints as the main inputs of the model for activity monitoring and fall detection. This allows the framework to be used on cost-effective devices. To ensure the quality of the framework, it was evaluated on three (3) publicly available datasets: Adhikari Dataset, UP Fall Dataset, and UR Fall Dataset by looking at accuracy, precision, recall, and F1 scores. 
Based from the results, the framework was able to achieve both state-of-the-art and real-time performance on these datasets.","PeriodicalId":285191,"journal":{"name":"2022 14th International Conference on Machine Learning and Computing (ICMLC)","volume":"56 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-02-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 14th International Conference on Machine Learning and Computing (ICMLC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3529836.3529862","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

In recent years, there has been increased interest in telemedicine as a way to access proper healthcare. However, one of the main issues for activity monitoring and fall detection in telehealth systems is the scalability of the technology to areas with inadequate technology infrastructure. As a potential solution, this study proposes an efficient activity monitoring and fall detection framework that can run in real time on CPU devices. In contrast to previous works, this study uses an efficient pose estimator, MediaPipe, and leverages the pose joints as the main inputs of the model for activity monitoring and fall detection, which allows the framework to run on cost-effective devices. To assess the quality of the framework, it was evaluated on three publicly available datasets: the Adhikari Dataset, the UP Fall Dataset, and the UR Fall Dataset, using accuracy, precision, recall, and F1 scores. Based on the results, the framework achieves both state-of-the-art and real-time performance on these datasets.
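The pipeline the abstract describes, pose joints per frame fed through a 1D convolution over time, can be illustrated with a minimal NumPy sketch. This is not the paper's model: the layer sizes, kernel width, 30-frame window, and random placeholder weights are all assumptions for illustration; only the input convention (MediaPipe Pose's 33 landmarks, flattened here to 66 x/y features per frame) comes from MediaPipe itself.

```python
import numpy as np

# MediaPipe Pose yields 33 landmarks per frame; using (x, y) gives 66 features.
NUM_FEATURES = 33 * 2

def conv1d(x, kernels, bias):
    """Valid 1D convolution over the time axis.
    x: (T, C_in), kernels: (C_out, K, C_in), bias: (C_out,).
    Returns (T - K + 1, C_out)."""
    T, _ = x.shape
    c_out, K, _ = kernels.shape
    out = np.empty((T - K + 1, c_out))
    for t in range(T - K + 1):
        window = x[t:t + K]  # (K, C_in) slice of frames
        # Contract kernel (K, C_in) dims against the window.
        out[t] = np.tensordot(kernels, window, axes=([1, 2], [0, 1])) + bias
    return out

def classify_window(frames):
    """Toy forward pass: conv1d -> ReLU -> global average pool -> linear.
    Weights are random placeholders, not the paper's trained parameters."""
    rng = np.random.default_rng(0)
    kernels = rng.normal(0.0, 0.1, size=(8, 5, NUM_FEATURES))
    bias = np.zeros(8)
    w_out = rng.normal(0.0, 0.1, size=(8, 2))  # 2 classes: activity vs. fall
    h = np.maximum(conv1d(frames, kernels, bias), 0.0)  # ReLU
    pooled = h.mean(axis=0)  # global average over time
    logits = pooled @ w_out
    return int(np.argmax(logits))

# A 30-frame window of flattened pose joints (about 1 s at 30 fps).
window = np.zeros((30, NUM_FEATURES))
print(classify_window(window))
```

In a real deployment the per-frame features would come from MediaPipe's pose landmarks rather than zeros, and the weights would be learned from the labeled fall datasets; the sketch only shows why this design stays CPU-friendly, since the 1D convolution runs over a short sequence of 66-dimensional joint vectors instead of raw video frames.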