{"title":"Vision-Based Robot Arm Control Interface for Retrieving Objects from the Floor","authors":"Laijun Yang, Ryota Sakamoto, N. Kato, K. Yano","doi":"10.20965/jrm.2023.p0501","DOIUrl":"https://doi.org/10.20965/jrm.2023.p0501","url":null,"abstract":"Approximately half of the patients with spinal cord injuries in Japan have a cervical spinal cord injury. Owing to the trunk dysfunction, patients with high-level spinal cord injuries have particular difficulty when searching for or picking up objects from the floor. Recently, welfare robot arms have been developed to help such individuals increase self-reliance. In this study, we propose an operating system that includes an eye-in-hand system with a touchscreen interface for grasping objects from the floor and delivering them to the individual. In the proposed method, the visual information of the target object is shown on a touchscreen interface. The patient specifies the target position for the robot arm by drawing a line on the target object on the interface. We conducted an experiment to compare the proposed interface with an on-screen joystick to demonstrate the proposed system’s efficiency and its ability to reduce physical burden. The results show that the proposed method is both quicker to use and effectively reduces the physical burden on the user compared to the conventional method.","PeriodicalId":178614,"journal":{"name":"J. Robotics Mechatronics","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125002038","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Application and Mechanical Evaluation of Polyarylate Fiber Rope in Wire Drive Mechanism of Robotic Surgical Instruments","authors":"Kanta Nojima, Kotaro Tadano, Daisuke Haraguchi","doi":"10.20965/jrm.2023.p0461","DOIUrl":"https://doi.org/10.20965/jrm.2023.p0461","url":null,"abstract":"In this study, a polyarylate fiber rope, which is a high-strength synthetic fiber rope, is used in the wire drive mechanism of a multi-degree of freedom (DOF) robotic forceps to evaluate its mechanical practicability. Using a nonconducting material for the drive wire, different from typical use of metallic wires made of stainless steel and tungsten, a technology is developed to simplify the insulation structure significantly, decrease the diameter of the robotic surgical instrument, and lower its cost. In this study, first, a prototype of the multi-DOF robotic forceps equipped with a polyetheretherketone (PEEK) resin flexible wrist joint part with an external diameter of 5 mm is manufactured. The prototype is used to evaluate the assembling of a polyarylate fiber rope with a diameter of 0.34 mm in a multi-DOF mechanism and examine the endurance of the rope to mechanical motions under a single-use assumption. As fastening structures to assemble the rope – a crimp terminal using a hollow pipe and a thread knot – are examined individually by assembling them in the prototype robotic forceps and conducting strength tests of the tension generated by the drive. The test results show that the thread knot method exerts a stabler fastening strength than the hollow pipe method. However, a problem of the former is that the wire may break because of its strong contact with the edge of the hole of the wire guide. Subsequently, to evaluate the endurance of the rope to single-use operation motion, operation tests are conducted by implementing reciprocating bending motions of the flexible wrist joint part of the robotic forceps 1,000 times. 
The assembled rope endures the sliding within the flexible wrist joint part and the contact loading with the guide part and the fixed structure within the cartridge repeatedly. The endurance operation test results confirm that the drive transmission of the polyarylate fiber rope has sufficient mechanical endurance to 1,000 reciprocating bending motions of the PEEK flexible wrist joint part.","PeriodicalId":178614,"journal":{"name":"J. Robotics Mechatronics","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120974499","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Error Covariance Estimation of 3D Point Cloud Registration Considering Surrounding Environment","authors":"Koki Aoki, Tomoya Sato, E. Takeuchi, Yoshiki Ninomiya, J. Meguro","doi":"10.20965/jrm.2023.p0435","DOIUrl":"https://doi.org/10.20965/jrm.2023.p0435","url":null,"abstract":"To realize autonomous vehicle safety, it is important to accurately estimate the vehicle’s pose. As one of the localization techniques, 3D point cloud registration is commonly used. However, pose errors are likely to occur when there are few features in the surrounding environment. Although many studies have been conducted on estimating error distribution of 3D point cloud registration, the real environment is not reflected. This paper presents real-time error covariance estimation in 3D point cloud registration according to the surrounding environment. The proposed method provides multiple initial poses for iterative optimization in the registration method. Using converged poses in multiple searches, the error covariance reflecting the real environment is obtained. However, the initial poses were limited to directions in which the pose error was likely to occur. Hence, the limited search efficiently determined local optima of the registration. In addition, the process was conducted within 10 Hz, which is laser imaging detection and ranging (LiDAR) period; however, the execution time exceeded 100 ms in some places. Therefore, further improvement is necessary.","PeriodicalId":178614,"journal":{"name":"J. 
Robotics Mechatronics","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117236719","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Map Creation for LiDAR Localization Based on the Design Drawings and Tablet Scan Data","authors":"Satoshi Ito, R. Kaneko, Takumi Saito, Yuji Nakamura","doi":"10.20965/jrm.2023.p0470","DOIUrl":"https://doi.org/10.20965/jrm.2023.p0470","url":null,"abstract":"This paper proposes a method for the point cloud data (PCD) map creation for the 3D LiDAR localization. The features of the method include the creation of a PCD map from a drawing of the buildings and partial scan of the not-existing object of the map by the tablet computer with the LiDAR. In the former, a map creation procedure, including the up- and down-sampling, as well as the processing, with voxel grid filter is established. In the latter, automatic position correction of the tablet scan data is introduced when they are placed to the current PCD map. Experiments are conducted to determine the size of the voxel grid filter and prove the effect of the tablet scan data in enhancing the matching level and the localization accuracy. Finally, the experiment with an autonomous mobile robot demonstrates that a map created using the proposed method is sufficient for autonomous driving without losing the localization.","PeriodicalId":178614,"journal":{"name":"J. Robotics Mechatronics","volume":"63 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125636718","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Autonomous Flight Using UWB-Based Positioning System with Optical Flow Sensors in a GPS-Denied Environment","authors":"Yoshiyuki Higashi, Kenta Yamazaki","doi":"10.20965/jrm.2023.p0328","DOIUrl":"https://doi.org/10.20965/jrm.2023.p0328","url":null,"abstract":"This study presents the positioning method and autonomous flight of a quadrotor drone using ultra-wideband (UWB) communication and an optical flow sensor. UWB communication obtains the distance between multiple ground stations and a mobile station on a robot, and the position is calculated based on a multilateration method similar to global positioning system (GPS). The update rate of positioning using only UWB communication devices is slow; hence, we improved the update rate by combining the UWB and inertial measurement unit (IMU) sensor in the prior study. This study demonstrates the improvement of the positioning method and accuracy by sensor fusion of the UWB device, an IMU, and an optical flow sensor using the extended Kalman filter. The proposed method is validated by hovering and position control experiments and also realizes a sufficient rate and accuracy for autonomous flight.","PeriodicalId":178614,"journal":{"name":"J. Robotics Mechatronics","volume":"275 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115247460","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Special Issue on Navigation and Control Technologies for Autonomous Mobility","authors":"Y. Minami, Hiroshi Okajima, K. Sawada, Kazuma Sekiguchi","doi":"10.20965/jrm.2023.p0229","DOIUrl":"https://doi.org/10.20965/jrm.2023.p0229","url":null,"abstract":"Autonomous mobility, as exemplified by self-driving cars, autonomous mobile robots, drones, etc., is essential to the acceleration and practical application of transportation services and the automation of delivery, guidance, security, and inspection. Therefore, in recent years, expectations have been building for autonomous mobility to grow as a technology that not only improves the convenience and comfort of transportation and the efficiency of logistics but also leads to solutions to various social problems. Various technological elements are required to ensure the safety and quality of autonomous mobility. For example, technology is needed to create environmental maps and automatically determine obstacles based on data acquired by cameras and sensors such as LiDAR. Technologies for planning appropriate routes and controlling robots safely and comfortably are also essential.\u0000 This special issue highlights 24 exciting papers, including 20 research papers, three letters, and one development report. They are related to “recognition,” “decision and planning,” and “control” technologies for autonomous mobile robots, such as self-driving cars and drones. The papers’ keywords are as follows:\u0000 • Collision avoidance, path planning, path tracking control\u0000 • Motion control, attitude control\u0000 • Measurement, position and posture estimation, modeling\u0000 • Point cloud processing\u0000 We would like to express our gratitude to all authors and reviewers, and we hope that this special issue contributes to future research and development in autonomous mobility.","PeriodicalId":178614,"journal":{"name":"J. 
Robotics Mechatronics","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132170974","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"MGV Obstacle Avoidance Trajectory Generation Considering Vehicle Shape","authors":"Y. Arai, Takashi Sago, Y. Ueyama, M. Harada","doi":"10.20965/jrm.2023.p0262","DOIUrl":"https://doi.org/10.20965/jrm.2023.p0262","url":null,"abstract":"This study investigates the application of obstacle avoidance trajectory generation considering the vehicle shape of a micro ground vehicle by successive convexification and state-triggered constraints. The avoidance trajectory is generated by numerical computation and path-following experiments are conducted to assess the generated trajectory. The numerical computation results indicate that the trajectory obtained by the algorithm successfully avoids obstacles considering the vehicle shape and satisfies the constraints. The experiment includes the model predictive control to follow the generated trajectory. Numerical computations and experiments confirm the usefulness of the trajectory generation algorithm.","PeriodicalId":178614,"journal":{"name":"J. Robotics Mechatronics","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121769784","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Gesture Interface and Transfer Method for AMR by Using Recognition of Pointing Direction and Object Recognition","authors":"T. Ikeda, Naoki Noda, S. Ueki, Hironao Yamada","doi":"10.20965/jrm.2023.p0288","DOIUrl":"https://doi.org/10.20965/jrm.2023.p0288","url":null,"abstract":"This paper describes a gesture interface for a factory transfer robot. Our proposed interface used gesture recognition to recognize the pointing direction, instead of estimating the point as in conventional pointing gesture estimation. When the autonomous mobile robot (AMR) recognized the pointing direction, it performed position control based on the object recognition. The AMR traveled along our unique path to ensure that its camera detected the object to be referenced for position control. The experimental results confirmed that the position and angular errors of the AMR controlled with our interface were 0.058 m and 4.7° averaged over five subjects and two conditions, which were sufficiently accurate for transportation. A questionnaire showed that our interface was user-friendly compared with manual operation with a commercially available controller.","PeriodicalId":178614,"journal":{"name":"J. Robotics Mechatronics","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121267146","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Development of a Spherical Shell Robot with Rolling and Legged Locomotion","authors":"R. Abe, C. Kanamori","doi":"10.20965/jrm.2023.p0483","DOIUrl":"https://doi.org/10.20965/jrm.2023.p0483","url":null,"abstract":"Herein, we propose a spherical shell robot that can roll and move on its legs, and develop a prototype of the robot. Recently, there has been a growing demand for robots that can move freely and gather information on rough terrains, such as disaster sites, which are not accessible to humans. The robot developed here has two types of mobilities: rolling movement using a spherical shape and walking movement using its legs. Because the morphological transformation does not require recombination of parts, it can be reversibly performed via remote control. Therefore, the robot can select the movement method according to the environment, and reach the target point reliably even on uneven terrains, such as a disaster site. We designed a mechanism that enabled the transformation of the form and devised an operation method. Accordingly, a prototype was developed and tested. A rolling test on flat ground confirmed that the robot can roll over 5.0 m and its speed could be controlled using a gyro sensor. The leg locomotion test confirmed that the robot can turn and move straight ahead without turning over. In addition, we also conducted experiments, such as sudden stops and remote morphological deformation, to confirm the operation of the robot during rolling.","PeriodicalId":178614,"journal":{"name":"J. 
Robotics Mechatronics","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124112607","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Study on Control for Prevention of Collision Caused by Failure of Localization for Map-Based Automated Driving Vehicle","authors":"Shun Nishimura, M. Omae","doi":"10.20965/jrm.2023.p0255","DOIUrl":"https://doi.org/10.20965/jrm.2023.p0255","url":null,"abstract":"In demonstration experiments of automated driving vehicles, lane departures and collisions with roadside structures due to poor vehicle positioning and self-localization have been reported. In this study, we propose a promising method to prevent such departures and collisions, and then validate the proposed method by applying it to an actual automated driving vehicle. The proposed method monitors the target steering angles computed by the automated driving control and limits them before commanded the actuator when there is a risk of colliding with obstacles. As the above-mentioned control is lower-level, it can prevent an automated driving vehicle from colliding with obstacles without complicating upper-level controls. Experiments on an actual automated driving vehicle showed that the steering control structure of the proposed method could prevent an automated driving vehicle from colliding with obstacles by limiting its target steering angle. In addition, the method does not impose excessive limits on the steering angle when the automated driving vehicle follows a normal path and no risk of collision exists.","PeriodicalId":178614,"journal":{"name":"J. Robotics Mechatronics","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114147541","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}