Harald Gietler, Christoph Böhm, Stefan Ainetter, Christian Schöffmann, F. Fraundorfer, S. Weiss, H. Zangl
{"title":"基于学习的林业起重机视觉抓取点预测自动化","authors":"Harald Gietler, Christoph Böhm, Stefan Ainetter, Christian Schöffmann, F. Fraundorfer, S. Weiss, H. Zangl","doi":"10.1109/SAS54819.2022.9881370","DOIUrl":null,"url":null,"abstract":"This paper presents an approach to automate the log-grasping of a forestry crane. A common hydraulic actuated log-crane is converted into a robotic device by retrofitting it with various sensors yielding perception of internal and environmental states. The approach uses a learning-based visual grasp detection. Once a suitable grasping candidate is determined, the crane starts its kinematic controlled operation. The system’s design process is based on a real-sim-real transfer to avoid possibly harmful, to humans and itself, crane behavior. Firstly, the grasping position prediction network is trained with real-world images. Secondly, an accurate simulation model of the crane, including photo-realistic synthetic images, is established. Note that in simulation, the prediction network trained on real-world data can be used without re-training. The simulation is used to design and verify the crane’s control- and the path planning scheme. In this stage, potentially dangerous maneuvers or insufficient quality of sensory information become visible. Thirdly, the elaborated closed-loop system configuration is transferred to the real-world forestry crane. The pick and place capabilities are verified in simulation as well as experimentally. A comparison shows that simulation and real-world scenarios perform equally well, validating the proposed real-sim-real design procedure.1","PeriodicalId":129732,"journal":{"name":"2022 IEEE Sensors Applications Symposium (SAS)","volume":"66 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Forestry Crane Automation using Learning-based Visual Grasping Point Prediction\",\"authors\":\"Harald Gietler, Christoph Böhm, Stefan Ainetter, Christian Schöffmann, F. Fraundorfer, S. Weiss, H. Zangl\",\"doi\":\"10.1109/SAS54819.2022.9881370\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents an approach to automate the log-grasping of a forestry crane. A common hydraulic actuated log-crane is converted into a robotic device by retrofitting it with various sensors yielding perception of internal and environmental states. The approach uses a learning-based visual grasp detection. Once a suitable grasping candidate is determined, the crane starts its kinematic controlled operation. The system’s design process is based on a real-sim-real transfer to avoid possibly harmful, to humans and itself, crane behavior. Firstly, the grasping position prediction network is trained with real-world images. Secondly, an accurate simulation model of the crane, including photo-realistic synthetic images, is established. Note that in simulation, the prediction network trained on real-world data can be used without re-training. The simulation is used to design and verify the crane’s control- and the path planning scheme. In this stage, potentially dangerous maneuvers or insufficient quality of sensory information become visible. Thirdly, the elaborated closed-loop system configuration is transferred to the real-world forestry crane. The pick and place capabilities are verified in simulation as well as experimentally. 
A comparison shows that simulation and real-world scenarios perform equally well, validating the proposed real-sim-real design procedure.1\",\"PeriodicalId\":129732,\"journal\":{\"name\":\"2022 IEEE Sensors Applications Symposium (SAS)\",\"volume\":\"66 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-08-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE Sensors Applications Symposium (SAS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SAS54819.2022.9881370\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE Sensors Applications Symposium (SAS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SAS54819.2022.9881370","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Forestry Crane Automation using Learning-based Visual Grasping Point Prediction
This paper presents an approach to automate the log grasping of a forestry crane. A common hydraulically actuated log crane is converted into a robotic device by retrofitting it with various sensors that provide perception of its internal and environmental states. The approach uses learning-based visual grasp detection. Once a suitable grasp candidate is determined, the crane starts its kinematically controlled operation. The system's design process is based on a real-sim-real transfer to avoid crane behavior that is potentially harmful to humans and to the machine itself. Firstly, the grasping position prediction network is trained with real-world images. Secondly, an accurate simulation model of the crane, including photo-realistic synthetic images, is established; note that in simulation, the prediction network trained on real-world data can be used without re-training. The simulation is used to design and verify the crane's control and path planning schemes. In this stage, potentially dangerous maneuvers or insufficient quality of sensory information become visible. Thirdly, the elaborated closed-loop system configuration is transferred to the real-world forestry crane. The pick-and-place capabilities are verified both in simulation and experimentally. A comparison shows that simulation and real-world scenarios perform equally well, validating the proposed real-sim-real design procedure.
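To make the described pipeline concrete, the following is a minimal, hypothetical sketch of the grasp-selection step: a learned network proposes a grasp candidate from a camera image, and the kinematically controlled crane motion is only triggered once a sufficiently confident candidate is found. The network architecture, output parameterization, names, and threshold are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch (not the paper's code): a toy grasp-point predictor and
# the decision logic that triggers the crane's kinematic control.
import numpy as np
import torch
import torch.nn as nn


class GraspPredictor(nn.Module):
    """Toy stand-in for the learning-based visual grasp detection network."""

    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Assumed outputs: pixel position (u, v), gripper angle, quality logit.
        self.head = nn.Linear(32, 4)

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(image))


def select_grasp(image: np.ndarray, model: GraspPredictor, min_quality: float = 0.8):
    """Return a grasp candidate (u, v, angle), or None if none is confident enough."""
    with torch.no_grad():
        x = torch.from_numpy(image).float().permute(2, 0, 1).unsqueeze(0) / 255.0
        u, v, angle, quality_logit = model(x).squeeze(0).tolist()
    confident = torch.sigmoid(torch.tensor(quality_logit)) > min_quality
    return (u, v, angle) if confident else None


if __name__ == "__main__":
    # Placeholder camera frame; in the real system this would come from the
    # retrofitted vision sensor observing the logs.
    camera_frame = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
    grasp = select_grasp(camera_frame, GraspPredictor())
    if grasp is not None:
        print("Start kinematically controlled approach to grasp candidate:", grasp)
    else:
        print("No suitable grasp candidate; keep observing the scene.")
```

Because the prediction operates purely on images, the same trained network can be fed photo-realistic synthetic renderings in simulation without re-training, which is the property the real-sim-real design procedure relies on.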