Comparing semiautomatic Rapid Upper Limb Assessments (RULA): Azure Kinect versus RGB-based machine vision algorithm

Antonio Maria Coruzzolo, Francesco Lolli, Nazareno Amicosante, Hrishikesh Kumar, P. Thupaki, S. Agarwal
{"title":"Comparing semiautomatic Rapid Upper Limb Assessments (RULA): Azure Kinect versus RGB-based machine vision algorithm","authors":"Antonio Maria Coruzzolo, Francesco Lolli, Nazareno Amicosante, Hrishikesh Kumar, P. Thupaki, S. Agarwal","doi":"10.54941/ahfe1002596","DOIUrl":null,"url":null,"abstract":"Correctly using a rapid upper limb assessment for working postures is crucial to avoid musculoskeletal disorders. Although motion capture technologies and in particular depth cameras are widely used, they cannot be used in large-scale industrial environments due to their high cost and their performance greatly impacted by the surrounding environment. We thus compared the effectiveness of a commercial machine vision algorithm (named ErgoEdge) based on an RGB camera against an application here developed based on the depth camera Microsoft Azure Kinect for the RULA evaluation (AzKRULA). We conducted an experiment where fifteen static postures were evaluated with Microsoft Azure Kinect and ErgoEdge, and the results were also compared with those of an expert in ergonomics. This experiment showed a substantial agreement between the solutions provided by the semi-automatic RULA evaluation and the ergonomic expert and between AzKRULA and ErgoEdge. At the same time, it showed that the RGB camera must be placed on the side of the worker due to the difficulties of the machine vision algorithm in reconstructing from a frontal view, important joint angles in 2D space (e.g., to evaluate the neck and trunk), which can invalidate the RULA evaluation provided by ErgoEdge. 
Moreover, the RULA evaluation with AzKRULA and ErgoEdge highlighted the need for an in-depth study into the thresholds of the secondary factors (i.e., all the factors for the RULA evaluation that are not computed from the thresholds of joint angles) as the highest differences between the two evaluations and the ergonomist one arises on them.","PeriodicalId":130337,"journal":{"name":"Physical Ergonomics and Human Factors","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Physical Ergonomics and Human Factors","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.54941/ahfe1002596","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Correctly using a rapid upper limb assessment (RULA) of working postures is crucial to avoid musculoskeletal disorders. Although motion capture technologies, and depth cameras in particular, are widely used, they are difficult to deploy in large-scale industrial environments because of their high cost and because their performance is greatly affected by the surrounding environment. We therefore compared the effectiveness of a commercial machine vision algorithm based on an RGB camera (ErgoEdge) against an application developed here for RULA evaluation with the Microsoft Azure Kinect depth camera (AzKRULA). We conducted an experiment in which fifteen static postures were evaluated with both AzKRULA and ErgoEdge, and the results were also compared with those of an ergonomics expert. The experiment showed substantial agreement both between the semi-automatic RULA evaluations and the ergonomics expert, and between AzKRULA and ErgoEdge. At the same time, it showed that the RGB camera must be placed at the worker's side: from a frontal view, the machine vision algorithm struggles to reconstruct important joint angles in 2D space (e.g., those needed to evaluate the neck and trunk), which can invalidate the RULA evaluation provided by ErgoEdge. Moreover, the RULA evaluations with AzKRULA and ErgoEdge highlighted the need for an in-depth study of the thresholds of the secondary factors (i.e., all the RULA factors that are not computed from joint-angle thresholds), as the largest differences between the two automatic evaluations and the ergonomist's arise from them.
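As an illustration of the kind of computation a semi-automatic RULA tool performs, the sketch below derives a joint angle from three 3D keypoints (as a depth camera like the Azure Kinect would provide) and maps a shoulder flexion angle onto the RULA upper-arm base score using the published thresholds from McAtamney and Corlett (1993). This is a hypothetical minimal example, not the actual AzKRULA or ErgoEdge implementation, and the function names are invented for illustration.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by 3D keypoints a-b-c.

    Hypothetical helper: a, b, c are (x, y, z) coordinates such as
    those returned by a body-tracking SDK for shoulder/elbow/wrist.
    """
    v1 = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v2 = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def rula_upper_arm_score(flexion_deg):
    """Map shoulder flexion (positive) / extension (negative) in degrees
    to the RULA upper-arm base score (thresholds from the original RULA
    worksheet; secondary adjustments, e.g. raised shoulder, omitted)."""
    if -20 <= flexion_deg <= 20:
        return 1          # 20 deg extension to 20 deg flexion
    if flexion_deg < -20 or flexion_deg <= 45:
        return 2          # >20 deg extension, or 20-45 deg flexion
    if flexion_deg <= 90:
        return 3          # 45-90 deg flexion
    return 4              # >90 deg flexion
```

For example, a 90-degree bend at the elbow keypoint yields `joint_angle((1, 0, 0), (0, 0, 0), (0, 1, 0)) == 90.0`, and a 60-degree shoulder flexion maps to an upper-arm score of 3. The secondary factors the abstract discusses (muscle use, load, shoulder raise, etc.) would be added on top of such base scores, which is precisely where the paper found the largest disagreement with the expert.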