{"title":"Vision System for Detecting and Locating Micro-Scale Objects with Guided Cartesian Robot","authors":"Naritpon Chavitranuruk, E. Pengwang","doi":"10.1109/ACIRS58671.2023.10240777","DOIUrl":"https://doi.org/10.1109/ACIRS58671.2023.10240777","url":null,"abstract":"This paper aims to use a vision system to detect and locate the position of microscale workpiece on the GelPak & wafer ring and command the end-effector to move to the targeted positions by using 2 sets of cameras. The object that is investigated in this design is Chip On-Sub-mount Assembly (COSA) with a footprint of ${270}times {270}$ micrometers. The first camera is used to find the COSA's position and count the quantity of the COSA on the GelPak and wafer ring. The second camera is used to find the exact position of the COSA and then the end-effector is moved to the target position on COSA's sub-mount. The challenges in this process are accuracy, system integration, and the compliance of the GelPak and the wafer ring. With this proposed design, the performances of the system are examined and validated for the working prototype. The result of the wide camera for finding the approximate center position of the COSAs is 94% successful, the percentage of counting COSA is 90% accuracy. The XY position error from the given position compared to the center position of COSA's sub-mount is ${4}.{882} {mu} {m}$ and ${8}.{206} {mu}{m}$ for x and y axis respectively.","PeriodicalId":148401,"journal":{"name":"2023 8th Asia-Pacific Conference on Intelligent Robot Systems (ACIRS)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133684456","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Threshold-Based Electroencephalography Brain-Computer Interface for Robot Arm Control","authors":"M. Rusydi, Elita Amrina, Yoan Winata, Salisa Asyarina Ramadhani, R. Nofendra","doi":"10.1109/ACIRS58671.2023.10240250","DOIUrl":"https://doi.org/10.1109/ACIRS58671.2023.10240250","url":null,"abstract":"Brain-Computer Interface (BCI) is a technique that uses real-time brain impulses to connect with and control external devices. BCI provides a new method for controlling external devices by translating brain signals into computer commands, facilitating the daily lives of people with disabilities and enhancing their ability to exhibit expected behavior. A Brain-Computer Interface (BCI) system based on Electroencephalography (EEG) was built to control the robotic arm. The EEG signals utilized included both eyes blinking, the right eye, the left eye, and the jaw contraction. EEG data were recorded from seven healthy subjects. The threshold approach is used to classify EEG signals, with the feature employed being the amplitude of the EEG signal. The highest threshold value for the blinking signal was 0.6 mV with an accuracy of 97.9%, while the best threshold value for jaw contraction was 0.4 mV with an accuracy of 93.34 percent. The healthy, inexperienced participants took part in system testing. The total results of testing each robot movement yielded an overall success rate of 84.52 percent. Therefore, it was determined that the system could facilitate the operation of the length robot even if the user lacked prior experience with EEG-based systems.","PeriodicalId":148401,"journal":{"name":"2023 8th Asia-Pacific Conference on Intelligent Robot Systems (ACIRS)","volume":"105 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114628215","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Enhancing Biomedical Education with Real-Time Object Identification through Augmented Reality","authors":"Dinali Nelushi Jayawardana, Ruth Agada, Jie Yan","doi":"10.1109/ACIRS58671.2023.10239781","DOIUrl":"https://doi.org/10.1109/ACIRS58671.2023.10239781","url":null,"abstract":"This paper introduces a mobile augmented reality (AR) platform aimed at training first and second-year STEM majors in college, specifically focusing on the identification and usage of laboratory instruments. The platform addresses the challenges faced by students in introductory STEM courses and utilizes computer vision and AR technologies to create an engaging and learner-centric teaching environment. The platform incorporates features such as a knowledge test, guided navigation using Google Map API, and the ability to build a library of laboratory instruments through augmented reality. Developed using the Unity 3D Game engine and ARkit-XR plugin, the platform currently identifies six laboratory instruments in the biology department, with plans for expansion. The target audience comprises freshman biology majoring and minoring students at the university. The application will be integrated as a pre-lab activity during the semester, requiring the use of specialized equipment. The effectiveness of the platform will be evaluated through a user study conducted within introductory biology courses. This mobile AR platform caters to a specific market niche, addressing the existing resource gap at the university and aiming to enhance the learning experience for biomedical students. Future research will focus on expanding the platform to include chemistry and physics labs while incorporating additional functionalities to deepen students' understanding of laboratory instruments and improve the overall learning environment.","PeriodicalId":148401,"journal":{"name":"2023 8th Asia-Pacific Conference on Intelligent Robot Systems (ACIRS)","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116685843","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Hydrodynamics Simulation of a Dual Fin Propelled Biomimetic Amphibious Robot","authors":"Minghai Xia, Qian Yin, Qunwei Zhu, Shanjun Chen, Jianzhong Shang, Zirong Luo","doi":"10.1109/ACIRS58671.2023.10240298","DOIUrl":"https://doi.org/10.1109/ACIRS58671.2023.10240298","url":null,"abstract":"Biomimetic robots have great advantages in terms of flexibility, efficiency, and maneuverability. In this paper, a novel amphibious robot which mimics the undulation motion of stingrays and snakes are proposed. The robot is able to swim underwater and walk on land by a pair of undulating fins. The structure of the robot is designed and the principle of locomotion method are described. The calculation platform is established. And the dynamic mesh method for computational fluid dynamics simulation is outlined. Underwater motion simulation is conducted in surging, steering and in-situ rotation patterns. The results show that the robot is capable of multimodal locomotion by the coordination of two fins.","PeriodicalId":148401,"journal":{"name":"2023 8th Asia-Pacific Conference on Intelligent Robot Systems (ACIRS)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134261930","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Automated Sorting System for Tahiti Lemons Using Raspberry PI","authors":"Carlos Janampa-Paitan, Jose Elias Flores-Llallico, Brayan Freddy Orellana-Garcia, Herbert Antonio Vilchez-Baca","doi":"10.1109/ACIRS58671.2023.10239685","DOIUrl":"https://doi.org/10.1109/ACIRS58671.2023.10239685","url":null,"abstract":"According to FreshFruit in the period 2023 the production of Tahitian lemons was 13 515 tons for USD 14 million being one of the most important citrus fruits in the Peruvian economy. This work develops an automated system for the classification of Tahiti lemons by size and maturity grade from green to yellow. For the simulation of the classification of Tahiti lemons by size from 3 to 6 cm, the bottleneck was determined by direct observation and then the mechanical system was programmed for the classification by size and degree of maturity through the detection of color by HSV to classify Tahiti lemons greater than 10% damage or by maturity, and then obtain the simulation in Factory IO and TIA PORTAL with connection to PLC S7-1200 1214 DC/DC/DC and a HMI TP700. Finally, the grading proposal was implemented in which 100% of the Tahiti lemons were recognized and packed for export through the interactive HMI screen, being able to classify and count them automatically, which has a graphical environment for the operator to manipulate them.","PeriodicalId":148401,"journal":{"name":"2023 8th Asia-Pacific Conference on Intelligent Robot Systems (ACIRS)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129367209","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Research on Robot Accuracy Compensation Method Based on Modified Grey Wolf Algorithm","authors":"Tianchen Peng, Tao Zhang, Zejun Sun","doi":"10.1109/ACIRS58671.2023.10239812","DOIUrl":"https://doi.org/10.1109/ACIRS58671.2023.10239812","url":null,"abstract":"This paper proposes a method using the modified grey wolf algorithm for optimizing robot motion accuracy to address problems of insufficient robot trajectory accuracy and low efficiency of traditional optimization algorithms. First, the Denavit-Hartenberg method is used to establish a robotics kinematic error model. Considering the parameters for optimization in the model as variables in the system, the problem of improving the accuracy of the robot is transformed into a problem of optimization for a nonlinear system. An objective function is designed according to the robot's trajectory it will be solved by the MGWO (modified grey wolf) algorithm to obtain the optimal parameters of the robot in order to improve the positioning accuracy of the robot. The experimental results show that this method is effective and can effectively reduce the robot motion error and improve positioning accuracy after algorithm optimization.","PeriodicalId":148401,"journal":{"name":"2023 8th Asia-Pacific Conference on Intelligent Robot Systems (ACIRS)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124496578","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"UUV Target Tracking Path Planning Algorithm Based on Deep Reinforcement Learning","authors":"You Yue, Wang Hao, Guanjie Hao, Yao Yao","doi":"10.1109/ACIRS58671.2023.10240259","DOIUrl":"https://doi.org/10.1109/ACIRS58671.2023.10240259","url":null,"abstract":"Path planning is one of the basic key problems in UUV task planning research. This paper studies the UUV path planning method in target tracking task scenario. The target is in a moving state, the moving elements are uncertain, and the traditional path planning algorithm is not applicable or easy to fall into the local optimal solution. In this paper, a tracing path planning algorithm based on deep reinforcement learning is presented, and a network parameter update method combining soft update with optimal sample training is proposed in the target network update link. The simulation results show that the algorithm can accelerate the network convergence speed while guaranteeing the stability of the learning process, and can quickly plan the optimal trajectory and maximize the time to track the target after UUV finds the target.","PeriodicalId":148401,"journal":{"name":"2023 8th Asia-Pacific Conference on Intelligent Robot Systems (ACIRS)","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125348178","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Application and Inspiration of Robots in the US Military","authors":"Chenggong Zhai, Pinghua Zhang, Hongsi Xu, Xingguang Yuan, Liyong Zhou, Rangmin Wu","doi":"10.1109/ACIRS58671.2023.10239710","DOIUrl":"https://doi.org/10.1109/ACIRS58671.2023.10239710","url":null,"abstract":"With the widespread application of unmanned equipment on the battlefield, research on unmanned autonomous technology has entered a rapid development stage. The application of unmanned combat systems is becoming increasingly widespread, and various types of military robots are emerging in large numbers. These military robots with higher intelligence, more flexible movements, and faster reactions are moving from behind the scenes to the front stage of war. This article analyzes the basic situation and development process of military robots, lists typical US military robots and their development situation, and proposes the development prospects of military robots. The current application of robotics technology in military warfare is not very widespread, but with the continuous development of science and technology, robots will play an irreplaceable and significant role in the future battlefield.","PeriodicalId":148401,"journal":{"name":"2023 8th Asia-Pacific Conference on Intelligent Robot Systems (ACIRS)","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127793352","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}