Autonomous Drifting with 3 Minutes of Data via Learned Tire Models
Franck Djeumou, Jonathan Y. Goh, U. Topcu, Avinash Balachandran
2023 IEEE International Conference on Robotics and Automation (ICRA). DOI: https://doi.org/10.1109/ICRA48891.2023.10161370

Abstract: Near the limits of adhesion, the forces generated by a tire are nonlinear and intricately coupled. Efficient and accurate modelling in this region could improve safety, especially in emergency situations where high forces are required. To this end, we propose a novel family of tire force models based on neural ordinary differential equations and a neural-ExpTanh parameterization. These models are designed to satisfy physically insightful assumptions while also having sufficient fidelity to capture higher-order effects directly from vehicle state measurements. They are used as drop-in replacements for an analytical brush tire model in an existing nonlinear model predictive control framework. Experiments with a customized Toyota Supra show that a scarce amount of driving data – less than three minutes – is sufficient to achieve high-performance autonomous drifting on various trajectories at speeds of up to 45 mph. Comparisons with the benchmark model show a 4x improvement in tracking performance, smoother control inputs, and faster and more consistent computation times.
{"title":"Distributed Model Predictive Formation Control with Gait Synchronization for Multiple Quadruped Robots","authors":"Shaohang Xu, Wentao Zhang, Lijun Zhu, C. Ho","doi":"10.1109/ICRA48891.2023.10161260","DOIUrl":"https://doi.org/10.1109/ICRA48891.2023.10161260","url":null,"abstract":"In this paper, we present a fully distributed framework for multiple quadruped robots in environments with obstacles. Our approach utilizes Model Predictive Control (MPC) and multi-robot consensus protocol to obtain the distributed control law. It ensures that all the robots are able to avoid obstacles, navigate to the desired positions, and meanwhile synchronize the gaits. In particular, via MPC and consensus, the robots compute the optimal trajectory and the contact profile of the legs. Then an MPC-based locomotion controller is implemented to achieve the gait, stabilize the locomotion and track the desired trajectory. We present experiments in simulation and with three real quadruped robots in an environment with a static obstacle.","PeriodicalId":360533,"journal":{"name":"2023 IEEE International Conference on Robotics and Automation (ICRA)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132963719","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Morphological Characteristics That Enable Stable and Efficient Walking in Hexapod Robot Driven by Reflex-based Intra-limb Coordination","authors":"Wataru Sato, J. Nishii, M. Hayashibe, D. Owaki","doi":"10.1109/ICRA48891.2023.10161107","DOIUrl":"https://doi.org/10.1109/ICRA48891.2023.10161107","url":null,"abstract":"Insects exhibit adaptive walking behavior in an unstructured environment, despite having only an extremely small number of neurons (105 to 106). This suggests that not only the brain nervous system but also properties of the physical body, such as the morphological characteristics, play an essential role in generating such adaptive behavior. Our study aims at investigating the effect of body morphological characteristics on the walking performance in a robot model, which is designed to mimic an insect. To this end, we constructed an insect-like hexapod model in a simulation environment that implements a reflex-based intra-limb coordination control. Herein, for a set of walking parameters, which were optimized to maximize the energy efficiency at the target speed, we investigated the effects of changes in the standard posture of the two leg joints on the walking success rate for various initial conditions and cost of transport (CoT) as an index of energy efficiency. Simulation results indicated that robots with specific morphological characteristics similar to those of insects exhibited high gait stability and energetic efficiency. Because only the reflex-based control was employed, the inter-leg coordination occurred spontaneously, suggesting that our approach would lead to a useful design methodology from the perspective of computational cost in generating the walking locomotion.","PeriodicalId":360533,"journal":{"name":"2023 IEEE International Conference on Robotics and Automation (ICRA)","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127637528","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
EFTrack: A Lightweight Siamese Network for Aerial Object Tracking
Wenqi Zhang, Yuan Yao, Xincheng Liu, Kai-chang Kou, Gang Yang
2023 IEEE International Conference on Robotics and Automation (ICRA). DOI: https://doi.org/10.1109/ICRA48891.2023.10160685

Abstract: Visual object tracking is an important task for unmanned aerial vehicles (UAVs). The limited resources of UAVs create a strong demand for efficient and robust trackers. In recent years, deep learning-based trackers, especially Siamese trackers, have achieved very impressive results. Although Siamese trackers can run relatively fast on high-end GPUs, they are becoming heavier and heavier, which prevents them from being deployed on UAV platforms. In this work, we propose a lightweight aerial tracker based on a Siamese network. We use EfficientNet as the backbone, which has fewer parameters and stronger feature extraction ability than ResNet-50. After a pixel-wise correlation, a classification branch and a regression branch predict the foreground/background score and the offset of the target without predefined anchors. The results show that our tracker works efficiently and achieves impressive performance on UAV tracking datasets. In addition, a real-world test shows that it runs effectively on an Nvidia Jetson NX deployed on a DJI UAV.
Design and Development of a Novel Force-Sensing Robotic System for the Transseptal Puncture in Left Atrial Catheter Ablation
A. Zeidan, Zhouyang Xu, Christopher E. Mower, Honglei Wu, Quentin Walker, Oyinkansola Ayoade, Natalia Cotic, J. Behar, Steven E. Williams, A. Arujuna, Y. Noh, R. Housden, K. Rhode
2023 IEEE International Conference on Robotics and Automation (ICRA). DOI: https://doi.org/10.1109/ICRA48891.2023.10160254

Abstract: Transseptal puncture (TSP) is a prerequisite for left atrial catheter ablation for atrial fibrillation, requiring access from the right side of the heart. It is a demanding procedural step associated with complications, including inadvertent puncture and the application of large forces on the tissue wall. Robotic systems have shown great potential to overcome such challenges by introducing force-sensing capabilities and increased precision and localization accuracy. This work therefore introduces the design and development of a novel robotic system developed to perform TSP. We integrated optoelectronic sensors into the tools' fixtures, measuring tissue contact and puncture forces along one axis. The novelty of this design is the system's ability to manipulate a Brockenbrough (BRK) needle and dilator-sheath simultaneously while measuring tissue contact and puncture forces. In puncture experiments on anthropomorphic tissue models, an average puncture force of 3.97 ± 0.45 N (1 SD) was measured, similar to the force reported in the literature for the manual procedure. This research highlights the potential for improving patient safety by enforcing force constraints, paving the way to more automated and safer TSP.
Compliant Finger Joint with Controlled Variable Stiffness based on Twisted Strings Actuation
Mihai Dragusanu, Danilo Troisi, D. Prattichizzo, M. Malvezzi
2023 IEEE International Conference on Robotics and Automation (ICRA). DOI: https://doi.org/10.1109/ICRA48891.2023.10160353

Abstract: Underactuated tendon-driven fingers are a simple yet effective solution for realizing robotic grippers and hands. The lack of controllable degrees of actuation and precise sensing is compensated by the deformable structure of the finger, which is able to adapt to the objects to be grasped and manipulated and to implement grasping strategies based on exploiting environmental constraints. One of the main drawbacks of these robotic fingers is that, due to the limited number of actuators, they can realize only a limited number of movements. The finger closure motion produced by activating the tendon depends on the finger's mechanical properties, and in particular on the elastic joint stiffness. In this paper, we introduce a passive elastic joint for monolithic fingers whose stiffness can be actively regulated by applying a pre-compression to the structure, controlled by a twisted-string actuator (TSA). The paper describes the working principle of the joint, investigates the relationship between pre-compression and flexural stiffness, and finally shows its application to a robotic finger composed of three phalanges.
{"title":"Heading Control of a Long-Endurance Insect-Scale Aerial Robot Powered by Soft Artificial Muscles","authors":"Y. Hsiao, Suhan Kim, Zhijian Ren, Yufeng Chen","doi":"10.1109/ICRA48891.2023.10161547","DOIUrl":"https://doi.org/10.1109/ICRA48891.2023.10161547","url":null,"abstract":"Aerial insects demonstrate fast and precise heading control when they perform body saccades and rapid escape maneuvers. While insect-scale micro-aerial-vehicles (IMAVs) have demonstrated early results on heading control, their flight endurance and heading angle tracking accuracy remain far inferior to that of natural fliers. In this work, we present a long endurance sub-gram aerial robot that can demonstrate effective heading control during hovering flight. Through using a tilted wing stroke-plane design, our robot demonstrates a 10-second flight where it tracks a desired yaw trajectory with maximum and root-mean-square (RMS) error of $boldsymbol{14.2^{circ}}$ and $boldsymbol{5.8}^{mathrm{o}}$. The new robot design requires 7% higher lift forces for enabling heading angle control, which creates higher stress on wing hinges and adversely influences robot endurance. To address this challenge, we developed novel 3-layered wing hinges that exhibit 1.82 times improvement of lifetime. With the new wing hinges, our robot demonstrates a 40-second hovering flight - the longest among existing sub-gram IMAVs. These results represent substantial improvement of flight capabilities in soft-actuated IMAVs, showing the potential of operating these insect-like fliers in cluttered natural environments.","PeriodicalId":360533,"journal":{"name":"2023 IEEE International Conference on Robotics and Automation (ICRA)","volume":"80 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133510388","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Generalizable Movement Intention Recognition with Multiple Heterogeneous EEG Datasets
Xiao Gu, Jinpei Han, Guang-zhong Yang, Benny P. L. Lo
2023 IEEE International Conference on Robotics and Automation (ICRA). DOI: https://doi.org/10.1109/ICRA48891.2023.10160462

Abstract: Human movement intention recognition is important for human-robot interaction. Existing work based on motor-imagery electroencephalography (EEG) provides a non-invasive and portable solution for intention detection. However, data-driven methods may suffer from the limited scale and diversity of the training datasets, which results in poor generalization to new test subjects. It is practically difficult to directly aggregate data from multiple datasets for training, since they often employ different channels, and the collected data suffer from significant domain shifts caused by different devices, experimental setups, etc. On the other hand, inter-subject heterogeneity is also substantial due to individual differences in EEG representations. In this work, we developed two networks that learn from the shared and the complete channels across datasets, handling inter-subject and inter-dataset heterogeneity respectively. Based on both networks, we further developed an online knowledge co-distillation framework in which the two networks learn collaboratively, achieving coherent performance gains. Experimental results show that our proposed method can effectively aggregate knowledge from multiple datasets, demonstrating better generalization under cross-subject validation.
Infrared Image Captioning with Wearable Device
Chenjun Gao, Yanzhi Dong, Xiaohu Yuan, Yifei Han, Huaping Liu
2023 IEEE International Conference on Robotics and Automation (ICRA). DOI: https://doi.org/10.1109/ICRA48891.2023.10160809

Abstract: Wearable devices have garnered widespread attention as a mobile solution, and intelligent modules based on wearable devices are increasingly being integrated. Image captioning, an important task in computer vision, maps images to text. Existing image captioning work relies on high-quality visible-light images; however, high target complexity and insufficient light can degrade captioning performance and cause mistakes. In this paper, we present an infrared image captioning framework designed to address the failure of visible-light image captioning under such conditions. Notably, we integrate the infrared image captioning model into the wearable device. Volunteers performed offline and real-time environmental analysis tasks in the real world to evaluate the framework's effectiveness in multiple scenarios. The results indicate that both the accuracy of infrared image captioning and the feedback from wearable device users are promising.
{"title":"EMS®: A Massive Computational Experiment Management System towards Data-driven Robotics","authors":"Qinjie Lin, Guo Ye, Han Liu","doi":"10.1109/ICRA48891.2023.10160948","DOIUrl":"https://doi.org/10.1109/ICRA48891.2023.10160948","url":null,"abstract":"We propose EMS®, a cloud-enabled massive computational experiment management system supporting high-throughput computational robotics research. Compared to existing systems, EMS® features a sky-based pipeline orchestrator which allows us to exploit heterogeneous computing environments painlessly (e.g., on-premise clusters, public clouds, edge devices) to optimally deploy large-scale computational jobs (e.g., with more than millions of computational hours) in an integrated fashion. Cornerstoned on this sky-based pipeline orchestrator, this paper introduces three abstraction layers of the EMS® software architecture: (i) Configuration management layer focusing on automatically enumerating experimental configurations; (ii) Dependency management layer focusing on managing the complex task dependencies within each experimental configuration; (iii) Computation management layer focusing on optimally executing the computational tasks using the given computing resource. Such an architectural design greatly increases the scalability and reproducibility of data-driven robotics research leading to much-improved productivity. To demonstrate this point, we compare EMS® with more traditional approaches on an offline reinforcement learning problem for training mobile robots. Our results show that EMS® outperforms more traditional approaches in two magnitudes of orders (in terms of experimental high throughput and cost) with only several lines of code change. We also exploit EMS® to develop mobile robot, robot arm, and bipedal applications, demonstrating its applicability to numerous robot applications.","PeriodicalId":360533,"journal":{"name":"2023 IEEE International Conference on Robotics and Automation (ICRA)","volume":"65 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115481400","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}