Latest Publications: 2023 IEEE International Conference on Robotics and Automation (ICRA)

Analytical Approach to Inverse Kinematics of Single Section Mobile Continuum Manipulators
2023 IEEE International Conference on Robotics and Automation (ICRA). Pub Date: 2023-05-29. DOI: 10.1109/ICRA48891.2023.10160825
Audrey Hyacinthe Bouyom Boutchouang, A. Melingui, J. M. Mvogo Ahanda, Xinrui Yang, Othman Lakhal, F. Biya Motto, R. Merzouki
Abstract: This paper proposes a novel mathematical solution to the inverse kinematics (IK) of single-section mobile continuum manipulators (SSMCMs). To achieve a given pose of the end-effector (EE), the solution determines the position and orientation parameters of the mobile platform and of a single section of the continuum manipulator. The proposed solution eliminates EE pose errors when the dynamic parameters are neglected and the continuum manipulator is cylindrical in shape. A simulation and an experiment validate the proposed approach.
Citations: 0
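The paper's analytical IK is not reproduced here, but single-section continuum kinematics is conventionally built on the constant-curvature model, in which a section is described by curvature, bending-plane angle, and arc length. A minimal sketch of the standard forward map (an illustration of that model, not the authors' IK solution):

```python
import math

def single_section_fk(kappa, phi, length):
    """End-effector position of one constant-curvature section.

    kappa  : curvature (1/m); zero means a straight section
    phi    : bending-plane angle (rad)
    length : arc length (m)
    """
    if abs(kappa) < 1e-9:            # straight-section limit
        return (0.0, 0.0, length)
    r = 1.0 / kappa                  # bending radius
    theta = kappa * length           # total bending angle
    x = r * (1.0 - math.cos(theta)) * math.cos(phi)
    y = r * (1.0 - math.cos(theta)) * math.sin(phi)
    z = r * math.sin(theta)
    return (x, y, z)
```

Inverting this map in closed form is what makes an analytical IK possible for the single-section case.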
Fusion of Events and Frames using 8-DOF Warping Model for Robust Feature Tracking
2023 IEEE International Conference on Robotics and Automation (ICRA). Pub Date: 2023-05-29. DOI: 10.1109/ICRA48891.2023.10161098
Min-Seok Lee, Yejun Kim, J. Jung, Chan Gook Park
Abstract: Event cameras are asynchronous neuromorphic vision sensors with high temporal resolution and no motion blur, offering advantages over standard frame-based cameras, especially under high-speed motion and high-dynamic-range conditions. However, event cameras cannot capture the overall context of a scene and produce different events for the same scenery depending on the direction of motion, which makes data association challenging. Standard cameras, on the other hand, provide frames at a fixed rate that are independent of the motion direction and rich in context. In this paper, we present a robust feature tracking method that employs an 8-DOF warping model to minimize the difference between brightness-increment patches from events and frames, exploiting the complementary nature of the two data types. Unlike previous works, the proposed method can track features under complex motions accompanied by distortions. Extensive quantitative evaluation on publicly available datasets shows that our method improves on state-of-the-art methods in robustness, with greatly prolonged feature age, and in accuracy for challenging scenarios.
Citations: 0
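An 8-DOF warp is a plane homography: a 3×3 matrix with one entry normalized away, applied in homogeneous coordinates. As a hedged illustration of the underlying geometry (not the authors' event/frame alignment code), warping a single point looks like:

```python
import numpy as np

def warp_point(H, pt):
    """Apply an 8-DOF warp (homography) to a 2-D point.

    H is 3x3 with H[2, 2] fixed to 1, leaving 8 free parameters;
    translation, affine, and projective warps are all special cases.
    """
    x, y = pt
    u = H @ np.array([x, y, 1.0])
    return u[:2] / u[2]              # perspective division
```

The identity matrix leaves points fixed; a pure translation only sets H[0, 2] and H[1, 2]; simpler 2- or 6-DOF models used in earlier trackers are special cases with the remaining entries constrained.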
SRI-Graph: A Novel Scene-Robot Interaction Graph for Robust Scene Understanding
2023 IEEE International Conference on Robotics and Automation (ICRA). Pub Date: 2023-05-29. DOI: 10.1109/ICRA48891.2023.10161085
D. Yang, Xiao Xu, Mengchen Xiong, Edwin Babaians, E. Steinbach
Abstract: We propose a novel scene-robot interaction graph (SRI-Graph) that exploits the known position of a mobile manipulator for robust and accurate scene understanding. Compared to state-of-the-art scene graph approaches, the proposed SRI-Graph captures not only the relationships between objects but also the relationships between the robot manipulator and the objects it interacts with. To improve the detection accuracy of spatial relationships, we leverage the 3D position of the mobile manipulator in addition to RGB images. The manipulator's ego information is crucial for successful scene understanding when relationships are visually uncertain. The proposed model is validated on a real-world 3D robot-assisted feeding task. We release a new dataset named 3DRF-Pos for training and validation, and we develop a tool named LabelImg-Rel, an extension of the open-source image annotation tool LabelImg, for convenient annotation in robot-environment interaction scenarios. Our experimental results on the Movo platform show that SRI-Graph outperforms the state-of-the-art approach, improving detection accuracy by up to 9.83%.
Citations: 1
Balancing Efficiency and Unpredictability in Multi-robot Patrolling: A MARL-Based Approach
2023 IEEE International Conference on Robotics and Automation (ICRA). Pub Date: 2023-05-29. DOI: 10.1109/ICRA48891.2023.10160923
Lingxiao Guo, Haoxuan Pan, Xiaoming Duan, Jianping He
Abstract: Patrolling with multiple robots is a challenging task. While the robots collaboratively and repeatedly cover the regions of interest in the environment, their routes should satisfy two often conflicting properties: (i) efficiency, meaning the time intervals between two consecutive visits to each region are small; and (ii) unpredictability, meaning the patrolling trajectories are random and hard to predict. We strike a balance between the two goals by (i) recasting the original patrolling problem as a graph deep learning problem and (ii) solving it directly on the graph in the framework of cooperative multi-agent reinforcement learning. Treating the decisions of a team of agents as a sequence input, our model outputs the agents' actions in order via an autoregressive mechanism. Extensive simulation studies show that our approach performs comparably to existing algorithms in efficiency and outperforms them in unpredictability. To our knowledge, this is the first work to successfully solve the patrolling problem with reinforcement learning on a graph.
Citations: 1
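The efficiency criterion above is commonly measured as idleness: the time a region waits between consecutive visits. A minimal sketch of such a metric (an illustration of the criterion, not the paper's reward function):

```python
def max_idleness(visit_times, horizon):
    """Worst gap between consecutive visits to any region over [0, horizon].

    visit_times maps each region to a list of visit timestamps; the patrol
    starts at time 0, so the first gap runs from 0 to the first visit and
    the last gap runs from the final visit to the horizon.
    """
    worst = 0.0
    for times in visit_times.values():
        prev = 0.0
        for t in sorted(times) + [horizon]:
            worst = max(worst, t - prev)
            prev = t
    return worst
```

An efficient patrol keeps this value small for every region, while an unpredictable one keeps the visit sequence hard to anticipate — the tension the paper's MARL formulation balances.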
Toward Cooperative 3D Object Reconstruction with Multi-agent
2023 IEEE International Conference on Robotics and Automation (ICRA). Pub Date: 2023-05-29. DOI: 10.1109/ICRA48891.2023.10160714
Xiong Li, Zhenyu Wen, Leiqiang Zhou, Chenwei Li, Yejian Zhou, Taotao Li, Zhen Hong
Abstract: We study the problem of object reconstruction in a multi-agent collaboration scenario. Specifically, we focus on reconstructing specific targets with several cooperative agents equipped with vision sensors, achieving higher efficiency than a single agent. Our main insight is that a complete 3D object can be split into several local 3D models assigned to different agents, and that the salient characteristics of the collaborating agents themselves can help integrate the local models. We develop a novel pipeline that first recovers local 3D models from the images obtained by different agents, then estimates the relative poses between collaborating agents by aligning intrinsic features; all local models are finally integrated using the estimated parameters. Extensive experiments show that the proposed method accurately reconstructs real-world 3D objects in a multi-agent collaborative manner. The full reconstruction pipeline is released to the public as an open-source project.
Citations: 1
Humans Need Augmented Feedback to Physically Track Non-Biological Robot Movements
2023 IEEE International Conference on Robotics and Automation (ICRA). Pub Date: 2023-05-29. DOI: 10.1109/ICRA48891.2023.10161075
Mahdiar Edraki, P. Maurice, D. Sternad
Abstract: An important component of effective human-robot collaboration is the compatibility of their movements, especially when humans physically collaborate with a robot partner. Following previous findings that humans interact more seamlessly with a robot that moves with human-like or biological velocity profiles, this study examined whether humans can adapt to a robot that violates these signatures, focusing on the roles of extensive practice and real-time augmented feedback. Six groups of participants physically tracked a robot tracing an ellipse with profiles in which velocity scaled with path curvature in biological or non-biological ways, while instructed to minimize the interaction force with the robot. Three of the six groups received real-time visual feedback about their force error. Results showed that over three daily practice sessions, participants given feedback about their force errors could decrease their interaction forces when the robot's trajectory violated human-like velocity patterns. Without augmented feedback, there was no improvement despite the extensive practice. The biological profile showed no improvement even with feedback, indicating that the (non-zero) force had already reached a floor level. These findings highlight the importance of biological robot trajectories and of augmented feedback for guiding humans to adapt to non-biological movements in physical human-robot interaction. The results have implications for various fields of robotics, such as surgical applications and collaborative robots in industry.
Citations: 0
Perturbation-Based Best Arm Identification for Efficient Task Planning with Monte-Carlo Tree Search
2023 IEEE International Conference on Robotics and Automation (ICRA). Pub Date: 2023-05-29. DOI: 10.1109/ICRA48891.2023.10161169
Daejong Jin, Juhan Park, Kyungjae Lee
Abstract: Combining task and motion planning (TAMP) is crucial for intelligent robots performing complex, long-horizon tasks. In TAMP, many approaches employ Monte-Carlo tree search (MCTS) with the upper confidence bound (UCB) for task planning, handling the exploration-exploitation trade-off and seeking globally optimal solutions. However, since UCB only accounts for the estimation error caused by noise, the error caused by insufficient optimization of a sub-tree is not represented, so UCB-based approaches fail to explore underestimated sub-trees. To alleviate this issue, we propose a novel tree search method using perturbation-based best-arm identification (PBAI). We theoretically bound the simple regret of our method and empirically verify that PBAI finds optimal task plans faster and more efficiently than existing algorithms. The source code of our algorithm is available at https://github.com/jdj2261/pytamp.
Citations: 0
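For context, the UCB rule the paper builds on scores each child node by its mean value plus an exploration bonus that shrinks with visit count; PBAI replaces this deterministic bonus with random perturbations. A textbook UCB1 selection step (background illustration, not the paper's PBAI code):

```python
import math

def ucb1_select(counts, values, c=math.sqrt(2)):
    """Pick the child maximizing mean value plus an exploration bonus.

    counts[i] : number of times child i has been visited
    values[i] : running mean reward of child i
    c         : exploration constant (sqrt(2) is the classic choice)
    """
    total = sum(counts)
    best, best_score = None, -float("inf")
    for i, (n, v) in enumerate(zip(counts, values)):
        if n == 0:
            return i                     # visit unvisited children first
        score = v + c * math.sqrt(math.log(total) / n)
        if score > best_score:
            best, best_score = i, score
    return best
```

Because the bonus only models sampling noise, a sub-tree whose value estimate is low merely for lack of optimization gets no extra exploration — the gap the paper's perturbation-based rule targets.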
Decision diagrams as plans: Answering observation-grounded queries
2023 IEEE International Conference on Robotics and Automation (ICRA). Pub Date: 2023-05-29. DOI: 10.1109/ICRA48891.2023.10161530
Dylan A. Shell, J. O’Kane
Abstract: We consider a robot that answers questions about its environment by traveling to appropriate places and then sensing. Questions are posed as structured queries and may involve conditional or contingent relationships between observable properties. After formulating this problem and emphasizing the advantages of exploiting deducible information, we describe how non-trivial knowledge of the world and queries can be given a convenient, concise, unified representation via reduced ordered binary decision diagrams (BDDs). To use these data structures directly for inference and planning, we introduce a new product operation and generalize classic dynamic variable reordering techniques to solve planning problems. Finally, we evaluate optimizations that exploit locality.
Citations: 0
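A decision diagram used as a plan can be executed by walking from the root, sensing the observable property each internal node names, and descending along the matching branch until a leaf answer is reached. A toy sketch with plain tuples (hypothetical property names; none of the paper's BDD machinery, reduction, or product operation is modeled):

```python
def execute_plan(node, observe):
    """Execute a decision-diagram plan.

    Internal nodes are tuples (property, low_child, high_child); the robot
    senses `property` via observe() and follows the high branch when it is
    true, the low branch otherwise. Leaves are final answers.
    """
    while isinstance(node, tuple):
        prop, low, high = node
        node = high if observe(prop) else low
    return node
```

Because the diagram is ordered and reduced, each property is sensed at most once along any execution path, which is what makes the representation attractive as a plan.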
Skill-based Robot Programming in Mixed Reality with Ad-hoc Validation Using a Force-enabled Digital Twin
2023 IEEE International Conference on Robotics and Automation (ICRA). Pub Date: 2023-05-29. DOI: 10.1109/ICRA48891.2023.10161095
J. Krieglstein, Gesche Held, B. A. Bálint, Frank Nägele, Werner Kraus
Abstract: Skill-based programming has proven advantageous for assembly tasks, but it still requires expert knowledge, especially for force-controlled applications, and it is error-prone due to the multitude of parameters, e.g., different coordinate frames and position-, velocity-, or force-controlled motions along the axes of a frame. We propose a mixed-reality-based solution that systematically visualizes the geometric constraints of advanced high-level skills directly in the real-world robotic environment and provides a user interface to create applications efficiently and safely in mixed reality. State-machine information is also visualized, and a holographic digital twin allows the user to validate the program ad hoc via force-enabled simulation. The approach is evaluated on a top-hat-rail mounting task, demonstrating the system's capability to handle advanced assembly programming tasks efficiently and tangibly.
Citations: 0
StereoVAE: A lightweight stereo-matching system using embedded GPUs
2023 IEEE International Conference on Robotics and Automation (ICRA). Pub Date: 2023-05-29. DOI: 10.1109/ICRA48891.2023.10160441
Qiong Chang, Xiang Li, Xin Xu, Xin Liu, Yun Li, Jun Miyazaki
Abstract: We propose a lightweight stereo-matching system for embedded graphics processing units (GPUs). The system overcomes the trade-off between accuracy and processing speed in stereo matching, improving matching accuracy while ensuring real-time processing. The basic idea is to construct a tiny neural network based on a variational autoencoder (VAE) that upscales and refines a small coarse disparity map, which is initially generated by a traditional matching method. This hybrid structure retains the low computational complexity of traditional methods while achieving higher matching accuracy with the help of the neural network. Extensive experiments on the KITTI 2015 benchmark demonstrate that our tiny system is highly robust in improving the accuracy of coarse disparity maps generated by different algorithms, while running in real time on embedded GPUs.
Citations: 0
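One detail worth noting in any upscale-then-refine pipeline is that disparity values are measured in pixels, so upscaling the map must also rescale the values. A hedged sketch of just that scaling step (the VAE refinement network, which does the actual quality improvement, is omitted):

```python
import numpy as np

def upscale_disparity(coarse, scale):
    """Nearest-neighbor upscale of a coarse disparity map.

    The values are multiplied by `scale` because disparity is a pixel
    offset between the left and right views, and pixel distances grow
    proportionally with image resolution.
    """
    up = np.repeat(np.repeat(coarse, scale, axis=0), scale, axis=1)
    return up * float(scale)
```

A refinement network would then take this blocky, value-corrected map (plus image features) and produce the final smooth disparity estimate.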