Safety-critical human detection featuring time-of-flight environment perception

N. Druml, Bernhard Rutte-Vas, Sandra Wilfling, C. Consani, M. Baumgart, T. Herndl, G. Holweg
DOI: 10.1109/ETFA.2017.8247647
Published in: 2017 22nd IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), September 2017, pp. 1-7
Citations: 1

Abstract

Industrie 4.0, the Industrial IoT, and cyber-physical production systems in general introduce high levels of automation. Hence, at the shop-floor level, for example, the interfaces between humans and machines are crucial. If a robot's environment perception is not robust and fail-safe, humans may not be detected properly, which can have critical consequences. However, given the resource constraints of the safety controllers typically used in critical application domains, implementing complex computer vision algorithms is often challenging. Here we explore the capabilities that indirect Time-of-Flight environment perception provides for speeding up complex computer vision algorithms. In particular, we investigate the use-case of human detection in critical and resource-constrained application domains, such as factory automation and automotive. We demonstrate that preprocessing based on Time-of-Flight 3D data can reduce the computation time of commonly used computer vision algorithms, such as Viola-Jones, by up to 50%. Furthermore, we showcase a human detection demonstrator case-study implemented on an AURIX processing system, which represents a state-of-the-art safety controller. By exploiting the Time-of-Flight technology's depth and amplitude data in a clever way, safety-critical human detection is enabled on the AURIX platform. Compared to competing environment perception technologies, the outlined solution is hardly achievable with structured light or stereo vision due to the safety controller's resource constraints.
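The abstract's core idea, that depth data can prune the image region a costly sliding-window detector such as Viola-Jones must scan, can be illustrated with a minimal sketch. The function names, the depth range, and the toy frame below are purely illustrative assumptions, not the authors' implementation:

```python
# Hypothetical sketch: use ToF depth data to restrict the detector's
# search to a region of interest (ROI) at plausible human distances,
# so the expensive cascade only runs on a fraction of the frame.

def depth_roi(depth_map, near=0.5, far=3.0):
    """Bounding box (r0, c0, r1, c1) of pixels whose depth lies in
    the assumed human range [near, far] metres, or None if empty."""
    rows = [r for r, row in enumerate(depth_map)
            if any(near <= d <= far for d in row)]
    cols = [c for row in depth_map
            for c, d in enumerate(row) if near <= d <= far]
    if not rows:
        return None
    return min(rows), min(cols), max(rows) + 1, max(cols) + 1

def search_area_reduction(depth_map, roi):
    """Fraction of the full frame the detector no longer has to scan."""
    h, w = len(depth_map), len(depth_map[0])
    r0, c0, r1, c1 = roi
    return 1.0 - ((r1 - r0) * (c1 - c0)) / (h * w)

# Toy 4x8 depth frame: background at 9 m, a 'person' patch at 1.5 m.
frame = [[9.0] * 8 for _ in range(4)]
for r in range(1, 3):
    for c in range(2, 5):
        frame[r][c] = 1.5

roi = depth_roi(frame)
print(roi)                                 # (1, 2, 3, 5)
print(search_area_reduction(frame, roi))   # 0.8125
```

In this toy case the detector would scan only 6 of 32 pixels; the paper reports that comparable depth-based pruning on real frames cuts Viola-Jones computation time by up to 50%.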