N. Druml, Bernhard Rutte-Vas, Sandra Wilfling, C. Consani, M. Baumgart, T. Herndl, G. Holweg
Published in: 2017 22nd IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), pp. 1–7, September 2017. DOI: 10.1109/ETFA.2017.8247647
Safety-critical human detection featuring time-of-flight environment perception
Industrie 4.0, Industrial IoT, and cyber-physical production systems in general introduce high levels of automation. Hence, at the shop floor level, for example, the interfaces between humans and machines are crucial. If a robot's environment perception is not robust and fail-safe, humans may not be detected properly, which may have critical consequences. However, given the resource constraints of the safety controllers typically used in critical application domains, implementing complex computer vision algorithms is often challenging. Here we explore the capabilities that indirect Time-of-Flight environment perception provides in order to speed up complex computer vision algorithms. In particular, we investigate the use case of human detection in critical and resource-constrained application domains, such as factory automation and automotive. We demonstrate that preprocessing based on Time-of-Flight 3D data can reduce the computation time of commonly used computer vision algorithms, such as Viola-Jones, by up to 50%. Furthermore, we showcase a human detection demonstrator case study implemented on an AURIX processing system, which represents a state-of-the-art safety controller. By exploiting the Time-of-Flight sensor's depth and amplitude data, safety-critical human detection is enabled on the AURIX platform. In contrast, competing environment perception technologies such as structured light or stereo vision can hardly achieve the outlined solution under the safety controller's resource constraints.
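The kind of depth-based preprocessing the abstract describes can be illustrated with a minimal sketch. This is not the paper's implementation: the function name `depth_roi`, the toy depth map, and the distance band are illustrative assumptions. The idea is to use Time-of-Flight depth data to mask out pixels that cannot belong to a nearby person, so an expensive sliding-window detector such as Viola-Jones only scans the remaining region:

```python
# Illustrative sketch (not the authors' code): prune the detector's search
# space using a Time-of-Flight depth map before running a classifier.

def depth_roi(depth, near, far):
    """Return the bounding box (row0, row1, col0, col1) of all pixels whose
    depth lies within [near, far] metres, or None if no pixel qualifies."""
    n_rows, n_cols = len(depth), len(depth[0])
    rows = [r for r in range(n_rows)
            if any(near <= depth[r][c] <= far for c in range(n_cols))]
    cols = [c for c in range(n_cols)
            if any(near <= depth[r][c] <= far for r in range(n_rows))]
    if not rows or not cols:
        return None
    return min(rows), max(rows) + 1, min(cols), max(cols) + 1

# Toy 6x8 depth map (metres): a "person" at 1.5 m against a 5 m background.
depth = [[5.0] * 8 for _ in range(6)]
for r in range(2, 5):
    for c in range(3, 6):
        depth[r][c] = 1.5

roi = depth_roi(depth, near=0.5, far=3.0)
print(roi)  # (2, 5, 3, 6)

# Fraction of the frame the classifier no longer has to scan:
r0, r1, c0, c1 = roi
saved = 1 - ((r1 - r0) * (c1 - c0)) / (6 * 8)
print(f"{saved:.0%} of pixels skipped")  # 81% of pixels skipped
```

On real sensor data, cropping the classifier's input to such a region of interest is one plausible way a preprocessing step could yield the reported reduction in computation time, since cascade-classifier cost scales with the number of windows evaluated.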