{"title":"Prevention of Forgetting Shopping Items in Response to Shopping List Ambiguity","authors":"Yoshihisa Igimi, Lifeng Zhang","doi":"10.12792/ICIAE2021.018","DOIUrl":"https://doi.org/10.12792/ICIAE2021.018","url":null,"abstract":"Shopping plays a critical role in daily life, and also it is a time-consuming activity. For the coming elder society in Japan, the elders who live alone have to go shopping by themselves and often forget to buy the target goods is in their shopping list. This forgetting is because the shopping list is often made with ambiguous expressions compared to the store registered product name. This research proposes a new shopping item forgetting prevention system that can detect the not buying item immediately after purchasing to dealing with the shopping list’s ambiguity.","PeriodicalId":161085,"journal":{"name":"The Proceedings of The 9th IIAE International Conference on Industrial Application Engineering 2020","volume":"177 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131397708","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Proposal of a Method for Real-time Detection of Obstacle Using Line Laser and Camera","authors":"Masaya Okamoto, Shiyuan Yang, S. Serikawa","doi":"10.12792/ICIAE2021.034","DOIUrl":"https://doi.org/10.12792/ICIAE2021.034","url":null,"abstract":"Automated guided vehicles (AGVs) have been widely used in factories and warehouses. Functions such as obstacle detection are indispensable for unmanned transport robots. We have developed a new approach for obstacle detection using a line laser and a camera. In this study, we improved the detection process of the system, made it realtime, implemented it on a robot, and verified the measurement accuracy of the system. As a result of measuring the distance to the obstacle and the size of the obstacle, the measurement error was small, within 20 mm, and it was confirmed that the system could detect the obstacle with good accuracy.","PeriodicalId":161085,"journal":{"name":"The Proceedings of The 9th IIAE International Conference on Industrial Application Engineering 2020","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-02-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129984387","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Obstacle Detection and Detour System Using Line Laser and Camera","authors":"Miki Suetsugu, Shiyuan Yang, S. Serikawa","doi":"10.12792/ICIAE2021.041","DOIUrl":"https://doi.org/10.12792/ICIAE2021.041","url":null,"abstract":"This paper develops a system that can easily and widely detect and avoid obstacles with one system by using a line laser and a camera. In order to use the automatic guided vehicle safely indoors, it is necessary to detect obstacles and avoid collisions. Therefore, we installed a line laser and a camera on the mobile robot to verify the accuracy of the system that detects and avoids obstacles. Conventional automatic guided vehicles include a method of detecting using a plurality of sensors and a method of detecting using LiDAR. However, these have problems that they are easily affected by disturbance and that the sensor itself is expensive and impractical. In this study, by using a line laser and a camera, it is possible to detect easily and widely with one system.","PeriodicalId":161085,"journal":{"name":"The Proceedings of The 9th IIAE International Conference on Industrial Application Engineering 2020","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-02-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132907944","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Development of a New Outlet Polarity Tester","authors":"K. Amimoto","doi":"10.12792/ICIAE2021.002","DOIUrl":"https://doi.org/10.12792/ICIAE2021.002","url":null,"abstract":"In this paper, we illustrate about a new outlet polarity tester which developed in our company. In recent years, it has been an increase in the opportunity to use the power outlet grounding electrode. This is because the use of grounded outlet has been mandated in the interior wiring regulation. Also in the interior wiring regulation, since it has been mandated to properly connect the polarity, polarity test is an essential test until completion of construction. However, in the current general outlet tester may not be able to confirm a false connection of grounded outlet. Further, the conventional polarity test for performing using commercial power test, it has a problem that can not be tested only after receiving. In order to solve these problems, we have developed a new type of outlet polarity tester. As the result of operation verification, we have confirmed that a new tester has the ability to easily discriminated the 24 type of connection state including a disconnection.","PeriodicalId":161085,"journal":{"name":"The Proceedings of The 9th IIAE International Conference on Industrial Application Engineering 2020","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-02-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131938601","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Relationship between Creative Inspiration, Self-Evaluated Satisfaction and Brain Activity through Simplified Music Composition","authors":"Hiromu Sato, Yuya Chiba, K. Moriya, M. Nakagawa","doi":"10.12792/ICIAE2021.023","DOIUrl":"https://doi.org/10.12792/ICIAE2021.023","url":null,"abstract":"The human prefrontal cortex (PFC) is an important target for a research which attempt to elucidate and control at will of higher brain function using non-invasive neuroimaging techniques. Particularly, creativity is specific higher human brain function and integral for evolution of human society. Creative inspirations generated during creative activity often solve hard problems and lead us in the good direction. The present study examined the PFC activity when creative inspiration happened and an influence of difference of brain activity on quality of creation, using the simplified music composition task and near infrared spectroscopy (NIRS). Deeply understanding these brain function possibly builds a foundation for development of methods to stimulate brain activation for creativity. The PFC activity measurements were achieved by measuring oxygen metabolism in cerebral blood flow (CBF) using 10-channel wearable optical topography (WOT-100, NeU Ltd.). We invested new simplified method of music composition to analyze brain activity of a person without instruments experience. A report of creative inspirations was conducted by raising subject’s hand as possible as small during the music composition task and we compared statistically brain activity of before and after inspirations. Correlation coefficient between brain activity and self-evaluation measured by visual analog scale (VAS) was investigated.The present research found that right ventrolateral PFC is activated by creative inspiration. Significant brain activation of 4 subjects out of participated 5 subjects was observed (P 0.05). Simplification of music composition task possibly mask clear correlation. In conclusion, the main finding that right PFC activation by inspirations indicated important mechanism of creativity.","PeriodicalId":161085,"journal":{"name":"The Proceedings of The 9th IIAE International Conference on Industrial Application Engineering 2020","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-02-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114554398","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Stereo Matching Based on Features of Image Patch","authors":"Shi Zhou, He Li, Miaomiao Zhu, Zhen Li, Lifeng Zhang, M. Mizumachi","doi":"10.12792/ICIAE2021.006","DOIUrl":"https://doi.org/10.12792/ICIAE2021.006","url":null,"abstract":"Stereo matching is a branch of 3D vision and has a wide range of applications in 3D reconstruction and autonomous driving. Recently, stereo matching methods leverage the information of two full image to calculate disparity map. However, these methods still have difficulties in texture-less regions and occlusion regions, and post-processing is used to improve accuracy. Therefore, there is a large computational cost in the feature extraction and post-processing. In this paper, we propose a stereo matching method based only on features of image patches and predict the disparity of region without occlusion. And post-processing is performed to modify all kind of mismatching based on the correct disparity. Furthermore, we evaluated our proposed method on the Middlebury dataset. The results show that our method performs well in all areas.","PeriodicalId":161085,"journal":{"name":"The Proceedings of The 9th IIAE International Conference on Industrial Application Engineering 2020","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-01-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128541510","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Proposal of Cursor Operation Method by Facial Movements for the Handicapped","authors":"Shunya Mito, Lifeng Zhang","doi":"10.12792/ICIAE2021.003","DOIUrl":"https://doi.org/10.12792/ICIAE2021.003","url":null,"abstract":"In recent years, the number of people with physical disabilities has been on the rise, with about half of them being physically disabled limb. A physically disabled person is a person with a physical disability that interferes with daily life, such as missing or malfunctioning fingers. This study focused on the use of computers in daily life. A person with a limb disability used a PC.Earlier studies that made \"eye contact\", \"voice input\", \"facial expressions using a 3D camera\". Those problems are the need to keep the distance between the face and the camera constant, the inability to protect privacy when people are around, and that 3D cameras are difficult and expensive to install. In this study, by using a webcam as input and machine learning, image processing techniques. Developing a system that allows a handicapped person to operate a computer with only facial movements without the use of arms or hands. We also evaluated the operability, improving the problems of previous studies.","PeriodicalId":161085,"journal":{"name":"The Proceedings of The 9th IIAE International Conference on Industrial Application Engineering 2020","volume":"444 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-01-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116751690","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Proposal for Creating Syllabic Datasets for Japanese Language Lipreading by Using Machine Learning","authors":"Rui Kitahara, Lifeng Zhang","doi":"10.12792/ICIAE2021.004","DOIUrl":"https://doi.org/10.12792/ICIAE2021.004","url":null,"abstract":"Although lip-reading using image processing and machine learning has been mainly performed at the word level, it has been shown that using LipNet, a network that enables recognition at the sentence level, improves the recognition accuracy over the former method. However, this was the case for English speakers. In this study, the data set was created based on speech scenes containing all 50 Japanese sounds, and the recognition accuracy was evaluated using LipNet.","PeriodicalId":161085,"journal":{"name":"The Proceedings of The 9th IIAE International Conference on Industrial Application Engineering 2020","volume":"127 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-01-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114432972","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Development of Training Game Application using Eye-gaze Control Technology to Support Employment of Physically challenged people","authors":"Ryoya Goto, K. Kiyota, M. Shimakawa, Koichiro Watanabe, Chiharu Okuma","doi":"10.12792/ICIAE2021.026","DOIUrl":"https://doi.org/10.12792/ICIAE2021.026","url":null,"abstract":"Assistive devices using eye-gaze control technology have been developed to control personal computers (PCs) so that they can be used by children and adults who have significant physical disabilities. In this time, we focused on human eye movements, which can be very effective as final input controls because they can often provide a means of input for people suffering from extensive physical disabilities. We describe a game application developed to assist persons when they practice controlling their PCs using eye-gaze control. Subjects were given a trial of an eye-controlled game in which they performed the task of counting vehicles. From the experimental results, we have confirmed the effectiveness of our proposed training application, although it needs to be reconsidered for extended use.","PeriodicalId":161085,"journal":{"name":"The Proceedings of The 9th IIAE International Conference on Industrial Application Engineering 2020","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132451379","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Perception of Approaching Objects in Bilateral Control Using Proximity Sensor","authors":"Takumi Karato, T. Nozaki, H. Krebs, T. Murakami","doi":"10.12792/ICIAE2021.028","DOIUrl":"https://doi.org/10.12792/ICIAE2021.028","url":null,"abstract":"In this paper, a new method for the perception of approaching objects in bilateral control using a hybrid proximity/force sensor is proposed.Recently, a demand for teleoperation which can accomplish safe, compliant and complicated tasks in a remote area is increasing. One method for teleoperation is bilateral control which can control both position and force in the master and the slave side. In bilateral control, an operator usually gets the environmental information of the slave side with vision sensors. However, in typical conditions, for example, when the interrupting objects exist between the vision sensor and end-effector like the body of the manipulator, the operator cannot observe the objects near the end-effector. So in this paper, for getting the complimentary information about environments, a new method for the perception of approaching objects in bilateral control using a hybrid proximity/force sensor is proposed. The hybrid proximity/force sensor can measure both distance between end-effector and objects and contact force. The approaching objects can be detected before objects touch the end-effector with this sensor. The operator can feel the approaching objects by vibrating only the manipulator on the master side. Hence, the operator can realize safe and soft contact with the hybrid sensor even when information of the vision sensor cannot be used. The effectiveness of the proposed method is verified by the experiments.","PeriodicalId":161085,"journal":{"name":"The Proceedings of The 9th IIAE International Conference on Industrial Application Engineering 2020","volume":"65 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-01-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132931733","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}