{"title":"In-Car Safety Judgment Using Machine Learning","authors":"Y. Tomita, S. Kato, M. Itami","doi":"10.1109/IEEECONF49454.2021.9382762","DOIUrl":"https://doi.org/10.1109/IEEECONF49454.2021.9382762","url":null,"abstract":"In a mobility service using an automated vehicle, the safety of passengers inside and outside the vehicle must be confirmed before the vehicle starts moving, and this check needs to be automated. We have therefore developed a system that uses camera image processing to determine the safety of passengers in and around a vehicle. This paper describes a system for determining a passenger's status using a machine-learning-based image analysis method.","PeriodicalId":395378,"journal":{"name":"2021 IEEE/SICE International Symposium on System Integration (SII)","volume":"114 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116431413","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"HAIROWorldPlugin: a Choreonoid plugin for virtually configuring decommissioning task environment for the robots","authors":"Kenta Suzuki, K. Kawabata","doi":"10.1109/IEEECONF49454.2021.9382771","DOIUrl":"https://doi.org/10.1109/IEEECONF49454.2021.9382771","url":null,"abstract":"This paper describes HAIROWorldPlugin, a plugin for Choreonoid. We have developed plugin functions to virtually configure decommissioning task environments and situations for remotely operated robots, referring to decommissioning work conducted at the Fukushima Daiichi Nuclear Power Station of Tokyo Electric Power Company Holdings, Inc. (FDNPS). HAIROWorldPlugin packages our previous development results, such as the Fluid Dynamics Simulator, Visual Effect Simulator, Communication Traffic Simulator, Motion Recorder, Model File Explorer, Crawler Robot Builder, Terrain Builder, and Operation Command Manager. Each function is explained, and examples of demonstrations are presented.","PeriodicalId":395378,"journal":{"name":"2021 IEEE/SICE International Symposium on System Integration (SII)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129511406","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Virtual Kinesthetic Teaching for Bimanual Telemanipulation","authors":"Inmo Jang, Hanlin Niu, Emily C. Collins, A. Weightman, J. Carrasco, B. Lennox","doi":"10.1109/IEEECONF49454.2021.9382763","DOIUrl":"https://doi.org/10.1109/IEEECONF49454.2021.9382763","url":null,"abstract":"This paper proposes a novel telemanipulation system that enables a human operator to control a dual-arm robot. The operation provides kinesthetic teaching via a digital twin of the robot which the operator cyber-physically guides to perform a task. Its key enabler is the concept of a virtual reality interactive marker, which serves as a simplified end effector of the digital twin robot. In virtual reality, the operator can interact with the marker using bare hands, which are sensed by the Leap Motion on top of a virtual reality headset. Then, the status (e.g. position/orientation) of the marker is transformed to the corresponding joint space command to the remote robot so that its end effector can follow the marker. We provide the details of the system architecture, and implement the system based on commercial robots/devices (i.e. UR5, Robotiq gripper, Leap Motion), virtual reality, ROS, and Unity3D. Moreover, the paper discusses the technical challenges that we had to address, and the system’s potential benefits from a human-robot interaction perspective.","PeriodicalId":395378,"journal":{"name":"2021 IEEE/SICE International Symposium on System Integration (SII)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130413612","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Human Pose Recognition under Cloth-like Objects from Depth Images using a Synthetic Image Dataset with Cloth Simulation","authors":"Shunsuke Ochi, J. Miura","doi":"10.1109/IEEECONF49454.2021.9382627","DOIUrl":"https://doi.org/10.1109/IEEECONF49454.2021.9382627","url":null,"abstract":"This paper proposes a method of human pose recognition when the body is largely covered by cloth-like objects such as blankets. Such recognition is useful for robotic monitoring of the elderly and the disabled. Human pose recognition under cloth-like objects is challenging due to the large variety of shapes the covering objects can take. Since we use depth images to address privacy and illumination issues, the problem becomes even more difficult. In this paper, we utilize computer graphics tools, including cloth simulation, to generate a synthetic dataset, which is then used to train a deep neural network for body parts segmentation. We achieved around 90% accuracy on synthetic data, showing the effectiveness of simulating cloth-like objects in data generation. We also applied the method to real data and examined the results to identify remaining issues.","PeriodicalId":395378,"journal":{"name":"2021 IEEE/SICE International Symposium on System Integration (SII)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131102486","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Proposal of blood flow promotion method using electrical stimulation synchronized with ECG*","authors":"K. Hattori, K. Nozawa, Suzuka Fujita, Sousuke Nakamura","doi":"10.1109/IEEECONF49454.2021.9382699","DOIUrl":"https://doi.org/10.1109/IEEECONF49454.2021.9382699","url":null,"abstract":"Prolonged seated posture can cause thrombosis in the veins of the lower extremities, leading to deep vein thrombosis (DVT) and pulmonary thromboembolism (PTE). To address these problems, it has been proposed to promote blood flow by repeatedly contracting and relaxing the muscle pump of the lower limbs. In this study, we hypothesize that blood flow can be maximized by synchronizing the timing of lower-limb contraction with the timing of heart relaxation, when blood is pumped from the veins, and we have developed a system that synchronizes the stimulation timing with the timing of maximum blood flow velocity in the veins of the lower limbs, based on the measured electrocardiogram.","PeriodicalId":395378,"journal":{"name":"2021 IEEE/SICE International Symposium on System Integration (SII)","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129842879","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Picking of One Sheet of Cotton Cloth by Rolling up Using Cylindrical Brushes","authors":"Y. Kawasaki, S. Arnold, Kimitoshi Yamazaki","doi":"10.1109/IEEECONF49454.2021.9382770","DOIUrl":"https://doi.org/10.1109/IEEECONF49454.2021.9382770","url":null,"abstract":"In this paper, we describe a method for automating the process of lifting one sheet of cloth from a stack of cotton sheets. In factory manufacturing of cloth products, many of the procedures for installing fabric parts on machines are still performed manually. In this study, we propose a method that first detects the edge of a sheet of cotton cloth, and then lifts it up by means of a cylindrical brush. Advantages of this method are that it avoids cloth damage, as well as its future potential for enabling dexterous manipulation by means of fine brushes. In verification experiments, the proposed end-effector was attached to the tip of a serial link manipulator. Then a system combining a color camera, a tactile sensor, and a light source was constructed. We evaluate the system and report its performance on a cloth picking task. In addition, we categorized the failure modes of the picking task and devised a vision process for distinguishing between them.","PeriodicalId":395378,"journal":{"name":"2021 IEEE/SICE International Symposium on System Integration (SII)","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130229329","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Development and Evaluation of a Tomato Fruit Suction Cutting Device","authors":"T. Fujinaga, S. Yasukawa, K. Ishii","doi":"10.1109/IEEECONF49454.2021.9382670","DOIUrl":"https://doi.org/10.1109/IEEECONF49454.2021.9382670","url":null,"abstract":"This paper introduces the structure and harvesting motion of the suction cutting device of a tomato harvesting robot, and reports on harvesting experiments conducted in a tomato greenhouse. The suction cutting device comprises a suction part and a cutting part. The suction part separates the target fruit from a tomato cluster, and the cutting part cuts the peduncle of the target fruit. A photoresistor in the cutting part assesses whether the target fruit is harvestable, and the cutting motion is performed only when the fruit is assessed as harvestable. In the experiments, 50 tomato clusters were randomly selected as harvesting objects, comprising 203 tomato fruits (including immature fruits). Of these 203 fruits, 114 were mature and within the robot's workspace. Of these 114 fruits, 105 were recognized as target fruits by the harvesting robot. Of these 105 fruits, 65 were assessed as harvestable, and 55 were successfully harvested (a harvesting success rate of 85%). Based on the results of the harvesting experiments, this study clarified the issues of the suction cutting device, classified the fruits according to whether they were easy or difficult to harvest, and evaluated the fruit characteristics qualitatively.","PeriodicalId":395378,"journal":{"name":"2021 IEEE/SICE International Symposium on System Integration (SII)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128681061","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Robust Heartbeat Interval Estimation Method against Various Postures on Bed Using Contactless Measurement","authors":"Tsuyoshi Sato, Toshiya Nakaigawa, N. Hamada, Y. Mitsukura","doi":"10.1109/IEEECONF49454.2021.9382689","DOIUrl":"https://doi.org/10.1109/IEEECONF49454.2021.9382689","url":null,"abstract":"The purpose of this paper is to propose a robust heartbeat interval estimation system, based on contactless measurement, that enables easy and stable health monitoring in daily life. In recent years, cardiovascular disease has become a major cause of death. The ballistocardiogram (BCG), which records the mechanical activity of the heart, has been studied as an unobtrusive measurement method. This technique offers the possibility of observing health status without causing any discomfort; however, the signal quality can vary greatly due to artifacts associated with the user's breathing or posture. Therefore, this study evaluates a robust algorithm for estimating heartbeat intervals from BCG signals measured by high-sensitivity load sensors mounted on the bed legs. Three healthy subjects participated in experiments in which they lay on the bed in various postures. As a result, mean beat-to-beat interval errors were less than about 50 ms when subjects held a decubitus position.","PeriodicalId":395378,"journal":{"name":"2021 IEEE/SICE International Symposium on System Integration (SII)","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126574594","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Development and Testing of Garbage Detection for Autonomous Robots in Outdoor Environments","authors":"Yuki Arai, Renato Miyagusuku, K. Ozaki","doi":"10.1109/IEEECONF49454.2021.9382646","DOIUrl":"https://doi.org/10.1109/IEEECONF49454.2021.9382646","url":null,"abstract":"In Japan, there is growing concern about labor shortages due to the declining birthrate and aging population, and there are high expectations for robots to help solve such social problems and create new industries. However, because public road tests are prohibited in Japan, there are few examples of robots being deployed in practice, so the considerations and problems involved in their practical application remain unclear. In this paper, focusing on the implementation of garbage collection technology, we develop an autonomous garbage collection robot using deep learning. In addition, we verify the usefulness of our garbage detection technology in outdoor environments through demonstrations at HANEDA INNOVATION CITY, a large-scale commercial and business complex on private property; at Utsunomiya University; and at Nakanoshima Challenge 2019, an outdoor demonstration-experiment field. Our garbage detector was designed to detect cans, plastic bottles, and lunch boxes automatically. Through experiments on test data and real-world outdoor experiments, we confirmed that our detector achieves 95.6% precision and 96.8% recall. Comparisons with other state-of-the-art detectors are also presented.","PeriodicalId":395378,"journal":{"name":"2021 IEEE/SICE International Symposium on System Integration (SII)","volume":"56 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127642426","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Reference ZMP Generation for Teleoperated Bipedal Robots Walking on Non-Flat Terrains","authors":"T. Ando, T. Watari, Ryo Kikuuwe","doi":"10.1109/IEEECONF49454.2021.9382614","DOIUrl":"https://doi.org/10.1109/IEEECONF49454.2021.9382614","url":null,"abstract":"This paper proposes a method for generating the reference Zero Moment Point (ZMP) for teleoperated bipedal robots walking on non-flat terrains. It combines preview control with the auxiliary ZMP method to realize ZMP-based walking. An auxiliary ZMP is used to generate the COG trajectory in real time by reducing the time delay caused by the preview control. The problem with such ZMP-based walking is that a ZMP cannot be defined when both feet are not on the same plane. Hence, in the proposed method, the virtual ZMP method is applied to calculate the ZMP. To calculate the virtual ZMP, the proposed method determines a virtual plane from the positional relation between the feet. This method does not use environmental information to determine the virtual plane, which makes it suitable for generating the reference ZMP of a teleoperated bipedal robot. The proposed method is validated in a real-time simulation environment involving haptic devices.","PeriodicalId":395378,"journal":{"name":"2021 IEEE/SICE International Symposium on System Integration (SII)","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125736447","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}