{"title":"Crawling gait generation method for four-limbed robot based on normalized energy stability margin","authors":"T. Matsuzawa, K. Hashimoto, Xiao Sun, Tomotaka Teramachi, S. Kimura, Nobuaki Sakai, Y. Yoshida, Asaki Imai, Kengo Kumagai, Takanobu Matsubara, Koki Yamaguchi, W. Tan, A. Takanishi","doi":"10.1109/SSRR.2017.8088167","DOIUrl":"https://doi.org/10.1109/SSRR.2017.8088167","url":null,"abstract":"In this paper, we describe a gait generation method for the crawling motion of a legged robot based on the Normalized Energy Stability Margin (NESM). Crawling is a locomotion style in which the robot's legs and torso are grounded alternately so that the robot moves with a low center of gravity, which reduces the impact on the robot and the risk of damage if it falls over. However, during the phase in which only the robot's torso is in contact with the ground, the support area is smaller than when the legs are grounded. On an inclined surface, this reduced support area may cause the robot to tip over sideways about the edge of its cuboid torso. As a result, the robot's feet may collide with the road surface as its legs swing forward, preventing the robot from performing its crawling motion. To deal with this problem, we propose a gait generation method for the crawling motion based on a stability criterion. Based on this criterion, the method selects the stance with which the robot lifts its torso and controls the landing height of the robot's feet according to the unevenness of the road surface. Experiments confirmed that stability improved when the four-limbed robot performed the crawling motion using the proposed method on an inclined road surface.","PeriodicalId":403881,"journal":{"name":"2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR)","volume":"47 35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123493157","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"WAREC-1 — A four-limbed robot having high locomotion ability with versatility in locomotion styles","authors":"K. Hashimoto, S. Kimura, Nobuaki Sakai, Shinya Hamamoto, Ayanori Koizumi, Xiao Sun, T. Matsuzawa, Tomotaka Teramachi, Y. Yoshida, Asaki Imai, Kengo Kumagai, Takanobu Matsubara, Koki Yamaguchi, Gan Ma, A. Takanishi","doi":"10.1109/SSRR.2017.8088159","DOIUrl":"https://doi.org/10.1109/SSRR.2017.8088159","url":null,"abstract":"This paper presents a novel four-limbed robot, WAREC-1, which has high locomotion ability and versatility in locomotion styles. At disaster sites, a robot must traverse various environments, such as rough terrain at risk of collapse, narrow spaces, stairs, and vertical ladders. WAREC-1 moves through hazardous environments by changing locomotion styles: bipedal/quadrupedal walking, crawling, and ladder climbing. WAREC-1 has four identically structured limbs with 7 DoF each, for 28 DoF in total. The robot is 1,690 mm tall when standing on two limbs and weighs 155 kg. We developed three types of actuator units with a hollow structure to pass the wiring inside the joints of WAREC-1, which enables the robot to move over rubble by creeping on its stomach. The body has a concave shape, and the end-effectors have a hook-like shape. The WAREC-1 robot is verified through experiments.","PeriodicalId":403881,"journal":{"name":"2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR)","volume":"100 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116798474","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A pre-offering view system for teleoperators of heavy machines to acquire cognitive maps","authors":"Ryuya Sato, Mitsuhiro Kamezaki, Satoshi Niuchi, S. Sugano, H. Iwata","doi":"10.1109/SSRR.2017.8088141","DOIUrl":"https://doi.org/10.1109/SSRR.2017.8088141","url":null,"abstract":"In teleoperation of heavy machines, work efficiency is about 50% lower than in manned operation because operators cannot obtain effective information about work sites due to the limitations of current monitoring systems. Operators would have opportunities to obtain such information before work (about a week is required to introduce teleoperation systems) and during work. As a fundamental study to support operators' spatial cognition, we developed views that provide spatial information about work sites before work. Humans form cognitive maps based on knowledge acquired from survey and route perspectives. To let operators acquire both kinds of knowledge, we provide a bird's-eye view that operators can change to acquire knowledge from the survey perspective, and a view from the operator's viewpoint that can be changed by the operator's intention to acquire knowledge from the route perspective. To evaluate the two pre-offering views, we performed experiments using a virtual reality simulator. The results indicated that the view for acquiring knowledge from the survey perspective could help operators plan globally, that the view for acquiring knowledge from the route perspective could help them plan locally, and that both could increase work efficiency.","PeriodicalId":403881,"journal":{"name":"2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR)","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133755136","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A preliminary study on a groping framework without external sensors to recognize near-environmental situation for risk-tolerance disaster response robots","authors":"Kui-Ting Chen, Mitsuhiro Kamezaki, Takahiro Katano, Taisei Kaneko, Kohga Azuma, Yusuke Uehara, T. Ishida, M. Seki, Ken Ichiryu, S. Sugano","doi":"10.1109/SSRR.2017.8088161","DOIUrl":"https://doi.org/10.1109/SSRR.2017.8088161","url":null,"abstract":"This paper proposes a basic near-environment recognition framework based on groping for risk-tolerant disaster response robots (DRRs). In extreme disaster sites with high radiation or heavy smog, external sensors such as cameras and laser range finders do not work properly, and such sensors may be broken in accidents during tasks. It is hoped that DRRs can continue to perform tasks even if their external sensors fail, or can at least safely evacuate from the site. In this preliminary study, we propose a groping method for recognizing near environments without using external sensors. In this method, a robot actively touches the environment using its arms or other movable parts, records the contact information, and then reconstructs a three-dimensional local map around the robot from the detected information, e.g., the arm's position and reactive force. By exploring a designated space with its arms, the proposed groping system can recognize three situations, namely an object, a step, and a pit, as well as their geometry. The groping strategy was designed considering the robot's specifications, time limitations, and the required resolution. Experiments were performed using the four-arm, four-crawler robot OCTOPUS. The results indicate that the proposed framework could recognize steps, pits, and objects and calculate an object's position and size, and confirm that the robot successfully removed an object on the basis of the groped data.","PeriodicalId":403881,"journal":{"name":"2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR)","volume":"59 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114665340","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Design of special end effectors for first aid robot","authors":"T. Park, C. Jeong, Jaeseong Lee, Seonghun Lee, Ikho Lee, Hyeonjung Kim, J. Ahn, D. Yun","doi":"10.1109/SSRR.2017.8088160","DOIUrl":"https://doi.org/10.1109/SSRR.2017.8088160","url":null,"abstract":"This paper proposes the design of various end effectors for first aid robots. There is growing demand for special robots that perform rescue operations on behalf of people in dangerous areas such as disaster or war zones. In addition, when injuries occur, first aid treatment is also necessary. In this paper, we define the user and system requirements for robots performing first aid and propose end-effector designs suitable for various first aid tasks.","PeriodicalId":403881,"journal":{"name":"2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR)","volume":"116 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116115379","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"UAS-Rx interface for mission planning, fire tracking, fire ignition, and real-time updating","authors":"Evan Beachly, Carrick Detweiler, Sebastian G. Elbaum, D. Twidwell, Brittany A. Duncan","doi":"10.1109/SSRR.2017.8088142","DOIUrl":"https://doi.org/10.1109/SSRR.2017.8088142","url":null,"abstract":"This paper presents the development of an interface for small Unmanned Aerial Systems that allows the deployment of ignition spheres at a prescribed fire, real-time fire modeling, and user updates to the automated fire model. Current systems are limited to fire monitoring or modeling, generally rely on a desktop computer, and allow neither updates to the model nor parameter adjustments in the field. The novelty of the current approach lies in enabling user control of all aspects of flight, including take-off, waypoint navigation, payload delivery, and landing from the interface, while also allowing fire modeling and incorporating this information into flight planning to increase the safety and effectiveness of the vehicle. This system will allow fire experts to reach previously inaccessible terrain to ignite controlled burns, to model fire progression through novel terrain and vegetation to improve current models, and to help team members maintain higher levels of situation awareness through the ability to project fire spread at future times. Initial user testing at a 40-acre prescribed burn shows that the model is considerably more accurate with user corrections, and that even half of the user corrections dramatically reduced the distance between the projected and actual fire lines. Future tests are planned with more users in challenging terrain to provide new information to the fire management communities.","PeriodicalId":403881,"journal":{"name":"2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114984642","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Paving green passage for emergency vehicle in heavy traffic: Real-time motion planning under the connected and automated vehicles environment","authors":"Bai Li, Youmin Zhang, Ning Jia, Changjun Zhou, Yuming Ge, Hong Liu, Wei Meng, Ce Ji","doi":"10.1109/SSRR.2017.8088156","DOIUrl":"https://doi.org/10.1109/SSRR.2017.8088156","url":null,"abstract":"This paper describes a real-time multi-vehicle motion planning (MVMP) algorithm for the emergency vehicle clearance task. To address the inherent limitations of human drivers in perception, communication, and cooperation, we require that the emergency vehicle and the surrounding normal vehicles be connected and automated vehicles (CAVs). The concerned MVMP task is to find cooperative trajectories such that the emergency vehicle can efficiently pass through the normal vehicles ahead. We use an optimal-control-based formulation to describe the MVMP problem, which is centralized, straightforward, and complete. For online solution, the centralized MVMP formulation is converted into a multi-period, multi-stage version. Concretely, each period consists of two stages: the emergency vehicle and several normal CAVs ahead try to form a regularized platoon via acceleration or deceleration (stage 1); when a regularized platoon is formed, these vehicles act cooperatively to make way for the emergency vehicle until the emergency vehicle becomes the leader of this local platoon (stage 2). When one period finishes, the subsequent period begins immediately. This sequential process continues until the emergency vehicle finally passes through all the normal CAVs. The subproblem at stage 1 is extremely easy because nearly all of the challenging nonlinearity is concentrated in stage 2; typical solutions to the stage-2 subproblem can be prepared offline and then implemented online directly. Through this, our proposed MVMP algorithm avoids heavy online computation and thus runs in real time.","PeriodicalId":403881,"journal":{"name":"2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR)","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125866706","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluation of LIDAR and GPS based SLAM on fire disaster in petrochemical complexes","authors":"Abu Ubaidah bin Shamsudin, Naoki Mizuno, Jun Fujita, K. Ohno, Ryunosuke Hamada, Thomas Westfechtel, S. Tadokoro, H. Amano","doi":"10.1109/SSRR.2017.8088139","DOIUrl":"https://doi.org/10.1109/SSRR.2017.8088139","url":null,"abstract":"Autonomy is important for firefighter robots in fire disaster response, and SLAM is a key technology for that autonomy. We want to know whether SLAM can be used in fire disasters. However, evaluating SLAM in an actual fire disaster is not possible because we cannot set large fires in actual petrochemical complexes. In this study, we simulated a fire disaster, collected sensor data under different conditions in the fire disaster, and evaluated the accuracy of SLAM. The effect of fire on LIDAR was analyzed and embedded in a LIDAR measurement simulator. Several measurement-interval parameters for a heat protection cover, which shields the sensor from heat, were also analyzed. The evaluation results show that the best parameters are 1 s of measurement followed by 9 s of sensor cooling, with which the average accuracy of GPS- and LIDAR-based SLAM was in the range of 0.25–0.36 m in the most difficult scenario in the petrochemical complex, which measures 1000 m × 600 m. Using the simulator enables us to evaluate the best interval parameters of GPS- and LIDAR-based SLAM in a fire disaster. The knowledge of the fire's effect on LIDAR could be used to improve LIDAR measurement in actual fire disasters.","PeriodicalId":403881,"journal":{"name":"2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR)","volume":"80 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130053815","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Robust SLAM system based on monocular vision and LiDAR for robotic urban search and rescue","authors":"Xieyuanli Chen, Hui Zhang, Huimin Lu, Junhao Xiao, Qihang Qiu, Yi Li","doi":"10.1109/SSRR.2017.8088138","DOIUrl":"https://doi.org/10.1109/SSRR.2017.8088138","url":null,"abstract":"In this paper, we propose a monocular SLAM system for robotic urban search and rescue (USAR), with which most USAR tasks (e.g., localization, mapping, exploration, and object recognition) can be fulfilled by rescue robots with only a single camera. The proposed system can be a promising basis for implementing fully autonomous rescue robots. However, the feature-based map built by the monocular SLAM is difficult for the operator to understand and use. We therefore combine the monocular SLAM with a 2D LIDAR SLAM to realize a 2D mapping and 6D localization system that not only obtains the real scale of the environment and makes the map friendlier to users, but also solves the problem that the 2D LIDAR SLAM cannot track the robot pose when the robot climbs stairs and ramps. We tested our system using a real rescue robot in simulated disaster environments. The experimental results show that good performance can be achieved with the proposed system in USAR. The system has also been successfully applied and tested in RoboCup Rescue Robot League (RRL) competitions, where our rescue robot team entered the top 5 and won the Best in Class Small Robot Mobility award at RoboCup RRL 2016 in Leipzig, Germany, and won the 2016 and 2017 RoboCup China Open RRL competitions.","PeriodicalId":403881,"journal":{"name":"2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR)","volume":"65 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131557600","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Proposal of simulation platform for robot operations with sound","authors":"M. Shimizu, Tomoichi Takahashi","doi":"10.1109/SSRR.2017.8088143","DOIUrl":"https://doi.org/10.1109/SSRR.2017.8088143","url":null,"abstract":"In recent natural disasters, robots have played an important role in search and rescue operations in places that are not easily accessible to humans. The key functions of robots in search and rescue operations are mobility on rough terrain, monitoring of the surroundings when searching for victims, and creating disaster maps. A robot test field should provide reaction loops between operators, robots, and the environment, with natural information for the human operators. Simulations should therefore provide more realistic information more naturally. We propose a simulation platform with realistic sound reactions from robot operations and noise from the environment. This paper proposes and discusses the need to include sound information in simulated inspection tasks, and presents new tasks that use sound. A prototype shows that the use of sound makes robot simulation applications more robust.","PeriodicalId":403881,"journal":{"name":"2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR)","volume":"53 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127089247","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}