{"title":"Comparative Modeling Study of Pneumatic Artificial Muscle Using Neural Networks, ANFIS and Curve Fitting","authors":"M. A. Mallouh, W. Araydah, Basel Jouda, M. Al-Khawaldeh","doi":"10.1109/ICARA56516.2023.10125812","DOIUrl":"https://doi.org/10.1109/ICARA56516.2023.10125812","url":null,"abstract":"Pneumatic Artificial Muscles (PAMs) are widely used in the fields of biorobots and medicine due to their flexibility, safe usage, lack of mechanical wear, low cost of manufacturing, and high ratio of power to weight. Obtaining an accurate PAM model is crucial for building a controller that obtains the required performance specifications. This study aims to create various models for a PAM and to evaluate them with respect to their accuracy in reflecting PAM behavior. An experimental-based modeling approach was adopted to collect the necessary data in order to accurately model the PAM. The data were collected for different pressure setpoints and with different loads. Four system modeling techniques were utilized: (i) curve/surface fitting, (ii) Multi-Layer Perceptron Neural Network (MLP NN), (iii) Nonlinear Auto-Regressive with eXogenous (NARX NN) and (IV) Adaptive Neuro Fuzzy Inference System (ANFIS). The analysis of the four developed models showed that the performance of the MLP NN model exceeded all other models by having the smallest error. Therefore, a simple feedforward neural network can represent the complex muscle system compared to other complex modeling techniques.","PeriodicalId":443572,"journal":{"name":"2023 9th International Conference on Automation, Robotics and Applications (ICARA)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130464675","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Cable-Driven Robotic Eye for Understanding Eye-Movement Control","authors":"A. John, A. Opstal, Alexandre Bernardino","doi":"10.1109/ICARA56516.2023.10126021","DOIUrl":"https://doi.org/10.1109/ICARA56516.2023.10126021","url":null,"abstract":"We propose a design for a bio-inspired robotic eye, with 6 independently controlled muscles, that is suitable for studying the emergence of human saccadic eye movements char-acteristics. Understanding how characteristics like the restriction of eye orientations to a 2D manifold, straight saccadic trajecto-ries, and saturating relationship between saccade amplitude and its peak velocity come about in a highly nonlinear system with non-commutativity of rotations is not trivial. Although earlier studies have addressed some of these problems, none have so far considered the full 3D complexity of ocular kinematics and dynamics. Our design contains a spherical eye actuated by six motor-driven cables with realistic pulling directions to mimic the six extraocular muscles. The coupling between the eyeball and eye socket has been designed to specify a damped rotational system, which is key to understanding the signals involved in the control of artificial and biological eyes. We present the mechanical design of the robotic system and a simulation model based on it. The system has a large range of movement and its dynamical responses to step inputs are shown, thus illustrating its ability to perform a wide range of eye movements with the appropriate characteristics.","PeriodicalId":443572,"journal":{"name":"2023 9th International Conference on Automation, Robotics and Applications (ICARA)","volume":"97 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134336807","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Discrete-time Distributed Optimization Algorithm for Multi-robot Coordination Target Monitor","authors":"Yanling Zheng, Qingshan Liu, Guoyi Chi","doi":"10.1109/ICARA56516.2023.10125641","DOIUrl":"https://doi.org/10.1109/ICARA56516.2023.10125641","url":null,"abstract":"In this paper, the task of multi-robot coordination monitor as an optimization problem is formulated. The whole cost function consists of the sum of local cost functions for each robot to evaluate the best location. To encircle the target, a global equality constraint is introduced, and convex sets are built for the feasibility constraints of robots' location. Then, a distributed discrete-time algorithm is developed for the task of multi-robot coordination monitor, and it is also proven to converge to an optimal solution of the established optimization problem under certain initial restriction. Finally, a numerical simulation shows the effectiveness of the proposed distributed optimization approach.","PeriodicalId":443572,"journal":{"name":"2023 9th International Conference on Automation, Robotics and Applications (ICARA)","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134051576","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Autonomous Navigation of Quadrotors Using Tactile Feedback","authors":"N. Borkar, P. Krishnamurthy, A. Tzes, F. Khorrami","doi":"10.1109/ICARA56516.2023.10125600","DOIUrl":"https://doi.org/10.1109/ICARA56516.2023.10125600","url":null,"abstract":"In this paper, we present a novel approach for autonomous navigation of quadrotors in complex unknown environments using tactile feedback. The approach uses an array of force/contact sensors on the quadrotor to determine local obstacle geometry and follow contours of sensed objects. The approach is particularly useful in scenarios where visibility is limited, such as in dark or smoky/foggy conditions, in which vision-based navigation is not possible. To show the efficacy of the proposed approach, we perform simulation studies in a variety of environments and demonstrate that the quadrotor is able to autonomously navigate without any a priori knowledge of the environment and without relying upon any vision-aided sensing of the environment.","PeriodicalId":443572,"journal":{"name":"2023 9th International Conference on Automation, Robotics and Applications (ICARA)","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122538114","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Online Surveying System for Experimentally Testing the Human Perception of Visual Gestures","authors":"Márk Domonkos, Ádám Tresó, János Botzheim","doi":"10.1109/ICARA56516.2023.10125860","DOIUrl":"https://doi.org/10.1109/ICARA56516.2023.10125860","url":null,"abstract":"In this paper, we would like to emphasize the need for an intuitive and easy-to-understand way of communication during a Human-Robot Collaboration (HRC) mainly in industrial scenarios. With the new communication design, the mental demands of the human workforce during collaboration can be lowered by the feedback given by the robot in a situation-aware way. This kind of feedback in close cooperation can maintain high importance in a manufacturing scenario. Another goal of this paper is to present the progress of former research that similarly dealt with visual signals during HRC. The goal during the design of the proposed novel methodology was to make the research of visual gestures in Human-Robot Interactions more effective and flexible. To address these demands an online surveying application is introduced and an initial proof of concept nature test was also conducted. During the investigation, we introduced emotional states in the test as a supporting modality for later use. From the analysis, we concluded that visual signals do have properties that can affect the perception of the viewer.","PeriodicalId":443572,"journal":{"name":"2023 9th International Conference on Automation, Robotics and Applications (ICARA)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127929815","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards Clear Evaluation of Robotic Visual Semantic Navigation","authors":"Carlos Gutiérrez-Álvarez, Sergio Hernández-García, Nadia Nasri, Alfredo Cuesta-Infante, R. López-Sastre","doi":"10.1109/ICARA56516.2023.10125866","DOIUrl":"https://doi.org/10.1109/ICARA56516.2023.10125866","url":null,"abstract":"In this paper we address the problem of visual semantic navigation (VSN), in which a robot needs to navigate through an environment to reach an object having only access to egocentric RGB perception sensors. This is a recently explored problem, where most of the approaches leverage last advances in deep learning models for visual perception, combined with reinforcement learning (RL) strategies. Nonetheless, after a review of the literature, it is complicated to perform direct comparisons between the different solutions. The main difficulties lie in the fact that the navigation environments in which the experimental metrics are reported are not accessible, and each approach uses different RL libraries. In this paper, we release a publicly available experimental setup for the VSN problem, with the aim of providing a clear benchmark. It has been constructed using pyRIL, an open source python library for RL, and two navigation environments: Miniwolrd-Maze from gym-miniworld, and one 3D scene from HM3D dataset using AI Habitat simulator. We finally propose a state-of-the-art VSN model, consisting in a Contrastive Language Image Pretraining (CLIP) visual encoder plus a set of two recurrent neural networks for producing the discrete navigation actions. This model is evaluated in the proposed experimental setup, with a careful analysis of the main VSN challenges, namely: the sparse rewards problem; and the exploitation-exploration trade-off. Code is available at: https://github.com/gramuah/vsn.","PeriodicalId":443572,"journal":{"name":"2023 9th International Conference on Automation, Robotics and Applications (ICARA)","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133101588","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Model-Based Approach for Remote Development of Embedded Software for Object Avoidance Applications","authors":"R. Beneder, Patrick Schmitt, Clemens Környefalvy","doi":"10.1109/ICARA56516.2023.10125627","DOIUrl":"https://doi.org/10.1109/ICARA56516.2023.10125627","url":null,"abstract":"The research and development of digital control sys-tems, embedded software and precise sensor measurement within the field of automated and autonomous robotics applications has increased significantly within the last decade. Based on these developments very complex, compute-intense tasks with real-time constraints in combination with Artificial Intelligence capabilities prepared the way for a new application field - so called “new aviation”. Within this research field various topics of robotics, embedded systems, power systems and vision systems play an important role. Moreover, the system developers and researchers within this industry need a profound knowledge in every technical discipline. This paper mainly focuses on tasks within the field of indoor navigation. This area of application can be utilized for logistics, maintenance and service tasks. The most important topic of indoor navigation tasks is the location determination and the orientation with no GPS position available. This paper introduces a model-based approach for remote development of embedded software for indoor object avoidance applications. This model-based approach helps to reduces the complexity of the implementation of multicore microcontroller applications with real-time constraints, which post-processes vision sensor data and utilizes them to automatically and au-tonomously orientate itself within an indoor environment.","PeriodicalId":443572,"journal":{"name":"2023 9th International Conference on Automation, Robotics and Applications (ICARA)","volume":"7 9","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120808691","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"UAV- Navigation Using Map Slicing and Safe Path Computation","authors":"Halil Utku Unlu, Dimitris Chaikalis, Athanasios Tsoukalas, A. Tzes","doi":"10.1109/ICARA56516.2023.10125893","DOIUrl":"https://doi.org/10.1109/ICARA56516.2023.10125893","url":null,"abstract":"This paper is concerned with the safe path planning of a drone while exploring an unknown space. The drone is localized by fusing measurements from sensors including an IMU, RGB-D sensor, and an optical flow system, while executing an RTAB-Map SLAM algorithm. The 3D-occupancy Octomap is generated online and a slicing algorithm is employed to compute 2D-maps. The maps' traversible coordinates are identified and used as potential points for the drone intermediate navigation to the destination. The final segment corresponds to a shortest Chebyshev-length path between all frontier pixels and the endpoint over the unexplored map region. The drone's path is computed using a skeletal path between the identified map boundaries so that the drone moves from its current location through the free map coordinates to the destination point. Simulation studies using within the interior of an apartment indicate the efficiency and effectiveness of the proposed method.","PeriodicalId":443572,"journal":{"name":"2023 9th International Conference on Automation, Robotics and Applications (ICARA)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125094406","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Studying Worker Perceptions on Safety, Autonomy, and Job Security in Human-Robot Collaboration","authors":"Gurpreet Kaur, Sean Banerjee, N. Banerjee","doi":"10.1109/ICARA56516.2023.10125842","DOIUrl":"https://doi.org/10.1109/ICARA56516.2023.10125842","url":null,"abstract":"We present a study on analyzing worker perceptions of safety, autonomy, and job security in collaborative environments where human co-workers and robots are expected to offer workers varying levels of collaborative assistance. With the rise in robotization, workers in blue-collar environments face the risk of being displaced. Recent studies suggest that despite showing concern for displacement, workers do see benefits of robots in the workplace, especially ones that collaborate with humans. We survey worker perceptions toward robots that offer varying levels of collaborative assistance-fully interventional or always assistive, fully standoff or never directly assistive, and assistive on an as-needed basis. We administer questionnaire-based surveys to N=530 blue-collar workers in companies spanning construction, contract work, manufacturing, retail, transportation and delivery, and warehousing in 4 countries. To understand the impact of corobots in promoting inclusivity, we break down our analysis in terms of age and sex. Our study shows that robots that provide as-needed assistance are viewed more favorably in terms of preserving autonomy and job security than fully interventional or fully standoff robots, and viewed more positively amongst female and older workers, demonstrating their potential to promote inclusivity and alleviate job displacement concerns.","PeriodicalId":443572,"journal":{"name":"2023 9th International Conference on Automation, Robotics and Applications (ICARA)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124333591","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Design and Feasibility Test of an Automatic Scraping Robot","authors":"Zhenmeng Cui, Liang Han, Guancheng Dong, Yingze Lin, Yangzhen Gao, Shuaishuai Fan","doi":"10.1109/ICARA56516.2023.10125772","DOIUrl":"https://doi.org/10.1109/ICARA56516.2023.10125772","url":null,"abstract":"Scraping is a key technology in high-precision machine tool machining. Scraping can eliminate the accumulated tolerances and improve the assembly accuracy of the machine tool. Scraping is a time-consuming and tedious manual labor, which is usually conducted by experienced technician. To overcome these shortcomings, a novel automatic scraping robot was designed and tested in this study. The robot includes a 3-axis moving mechanism, a vision recognition system, a 3-D measurement system, and a control system. In this study, milling is used to simulate the shape of tool marks in the traditional scraping process. After a series of tests, the designed robot has been verified to be able to perform automatic scraping work. The workpiece scraped with this robot meets the standard of a high precision machine.","PeriodicalId":443572,"journal":{"name":"2023 9th International Conference on Automation, Robotics and Applications (ICARA)","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129567435","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}