Frontiers in Robotics and AI: Latest Publications

Advancements in the use of AI in the diagnosis and management of inflammatory bowel disease.
IF 2.9
Frontiers in Robotics and AI Pub Date : 2024-10-21 eCollection Date: 2024-01-01 DOI: 10.3389/frobt.2024.1453194
Dalia Braverman-Jaiven, Luigi Manfredi
{"title":"Advancements in the use of AI in the diagnosis and management of inflammatory bowel disease.","authors":"Dalia Braverman-Jaiven, Luigi Manfredi","doi":"10.3389/frobt.2024.1453194","DOIUrl":"10.3389/frobt.2024.1453194","url":null,"abstract":"<p><p>Inflammatory bowel disease (IBD) causes chronic inflammation of the colon and digestive tract, and it can be classified as Crohn's disease (CD) and Ulcerative colitis (UC). IBD is more prevalent in Europe and North America, however, since the beginning of the 21st century it has been increasing in South America, Asia, and Africa, leading to its consideration as a worldwide problem. Optical colonoscopy is one of the crucial tests in diagnosing and assessing the progression and prognosis of IBD, as it allows a real-time optical visualization of the colonic wall and ileum and allows for the collection of tissue samples. The accuracy of colonoscopy procedures depends on the expertise and ability of the endoscopists. Therefore, algorithms based on Deep Learning (DL) and Convolutional Neural Networks (CNN) for colonoscopy images and videos are growing in popularity, especially for the detection and classification of colorectal polyps. The performance of this system is dependent on the quality and quantity of the data used for training. There are several datasets publicly available for endoscopy images and videos, but most of them are solely specialized in polyps. The use of DL algorithms to detect IBD is still in its inception, most studies are based on assessing the severity of UC. As artificial intelligence (AI) grows in popularity there is a growing interest in the use of these algorithms for diagnosing and classifying the IBDs and managing their progression. To tackle this, more annotated colonoscopy images and videos will be required for the training of new and more reliable AI algorithms. This article discusses the current challenges in the early detection of IBD, focusing on the available AI algorithms, and databases, and the challenges ahead to improve the detection rate.</p>","PeriodicalId":47597,"journal":{"name":"Frontiers in Robotics and AI","volume":"11 ","pages":"1453194"},"PeriodicalIF":2.9,"publicationDate":"2024-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11532194/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142577084","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Simulation and real-life implementation of UAV autonomous landing system based on object recognition and tracking for safe landing in uncertain environments.
IF 2.9
Frontiers in Robotics and AI Pub Date : 2024-10-18 eCollection Date: 2024-01-01 DOI: 10.3389/frobt.2024.1450266
Ranjai Baidya, Heon Jeong
{"title":"Simulation and real-life implementation of UAV autonomous landing system based on object recognition and tracking for safe landing in uncertain environments.","authors":"Ranjai Baidya, Heon Jeong","doi":"10.3389/frobt.2024.1450266","DOIUrl":"https://doi.org/10.3389/frobt.2024.1450266","url":null,"abstract":"<p><p>The use of autonomous Unmanned Aerial Vehicles (UAVs) has been increasing, and the autonomy of these systems and their capabilities in dealing with uncertainties is crucial. Autonomous landing is pivotal for the success of an autonomous mission of UAVs. This paper presents an autonomous landing system for quadrotor UAVs with the ability to perform smooth landing even in undesirable conditions like obstruction by obstacles in and around the designated landing area and inability to identify or the absence of a visual marker establishing the designated landing area. We have integrated algorithms like version 5 of You Only Look Once (YOLOv5), DeepSORT, Euclidean distance transform, and Proportional-Integral-Derivative (PID) controller to strengthen the robustness of the overall system. While the YOLOv5 model is trained to identify the visual marker of the landing area and some common obstacles like people, cars, and trees, the DeepSORT algorithm keeps track of the identified objects. Similarly, using the detection of the identified objects and Euclidean distance transform, an open space without any obstacles to land could be identified if necessary. Finally, the PID controller generates appropriate movement values for the UAV using the visual cues of the target landing area and the obstacles. To warrant the validity of the overall system without risking the safety of the involved people, initial tests are performed, and a software-based simulation is performed before executing the tests in real life. A full-blown hardware system with an autonomous landing system is then built and tested in real life. The designed system is tested in various scenarios to verify the effectiveness of the system. The code is available at this repository: https://github.com/rnjbdya/Vision-based-UAV-autonomous-landing.</p>","PeriodicalId":47597,"journal":{"name":"Frontiers in Robotics and AI","volume":"11 ","pages":"1450266"},"PeriodicalIF":2.9,"publicationDate":"2024-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11551718/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142630038","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Remote science at sea with remotely operated vehicles.
IF 2.9
Frontiers in Robotics and AI Pub Date : 2024-10-18 eCollection Date: 2024-01-01 DOI: 10.3389/frobt.2024.1454923
Zara Mirmalek, Nicole A Raineault
{"title":"Remote science at sea with remotely operated vehicles.","authors":"Zara Mirmalek, Nicole A Raineault","doi":"10.3389/frobt.2024.1454923","DOIUrl":"10.3389/frobt.2024.1454923","url":null,"abstract":"<p><p>Conducting sea-going ocean science no longer needs to be limited to the number of berths on a ship given that telecommunications, computing, and networking technologies onboard ships have become familiar mechanisms for expanding scientists' reach from onshore. The oceanographic community routinely works with remotely operated vehicles (ROVs) and pilots to access real-time video and data from the deep sea, while onboard a ship. The extension of using an ROV and its host vessel's live-streaming capabilities has been popularized for almost 3 decades as a telepresence technology. Telepresence-enabled vessels with ROVs have been employed for science, education, and outreach, giving a greater number of communities viewing access to ocean science. However, the slower development of technologies and social processes enabling sustained real-time involvement between scientists on-ship and onshore undermines the potential for broader access, which limits the possibility of increasing inclusivity and discoveries through a diversity of knowledge and capabilities. This article reviews ocean scientists' use of telepresence for ROV-based deep-sea research and funded studies of telepresence capabilities. The authors summarize these studies findings and conditions that lead to defining the use of telepresence-enabled vessels for \"remote science at sea.\" Authors define remote science at sea as a type of ocean expedition, an additional capability, not a replacement for all practices by which scientists conduct ocean research. Remote science for ocean research is an expedition at-sea directed by a distributed science team working together from at least two locations (on-ship and onshore) to complete their science objectives for which primary data is acquired by robotic technologies, with connectivity supported by a high-bandwidth satellite and the telepresence-enabled ship's technologies to support the science team actively engaged before, during, and after dives across worksites. The growth of productive ocean expeditions with remote science is met with social, technical, and logistical challenges that impede the ability of remote scientists to succeed. In this article, authors review telepresence-enabled ocean science, define and situate the adjoined model of remote science at sea, and some infrastructural, technological and social considerations for conducting and further developing remote science at sea.</p>","PeriodicalId":47597,"journal":{"name":"Frontiers in Robotics and AI","volume":"11 ","pages":"1454923"},"PeriodicalIF":2.9,"publicationDate":"2024-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11527704/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142570039","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
A pipeline for estimating human attention toward objects with on-board cameras on the iCub humanoid robot.
IF 2.9
Frontiers in Robotics and AI Pub Date : 2024-10-17 eCollection Date: 2024-01-01 DOI: 10.3389/frobt.2024.1346714
Shiva Hanifi, Elisa Maiettini, Maria Lombardi, Lorenzo Natale
{"title":"A pipeline for estimating human attention toward objects with on-board cameras on the iCub humanoid robot.","authors":"Shiva Hanifi, Elisa Maiettini, Maria Lombardi, Lorenzo Natale","doi":"10.3389/frobt.2024.1346714","DOIUrl":"10.3389/frobt.2024.1346714","url":null,"abstract":"<p><p>This research report introduces a learning system designed to detect the object that humans are gazing at, using solely visual feedback. By incorporating face detection, human attention prediction, and online object detection, the system enables the robot to perceive and interpret human gaze accurately, thereby facilitating the establishment of joint attention with human partners. Additionally, a novel dataset collected with the humanoid robot iCub is introduced, comprising more than 22,000 images from ten participants gazing at different annotated objects. This dataset serves as a benchmark for human gaze estimation in table-top human-robot interaction (HRI) contexts. In this work, we use it to assess the proposed pipeline's performance and examine each component's effectiveness. Furthermore, the developed system is deployed on the iCub and showcases its functionality. The results demonstrate the potential of the proposed approach as a first step to enhancing social awareness and responsiveness in social robotics. This advancement can enhance assistance and support in collaborative scenarios, promoting more efficient human-robot collaborations.</p>","PeriodicalId":47597,"journal":{"name":"Frontiers in Robotics and AI","volume":"11 ","pages":"1346714"},"PeriodicalIF":2.9,"publicationDate":"2024-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11524796/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142559143","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Leveraging imitation learning in agricultural robotics: a comprehensive survey and comparative analysis.
IF 2.9
Frontiers in Robotics and AI Pub Date : 2024-10-17 eCollection Date: 2024-01-01 DOI: 10.3389/frobt.2024.1441312
Siavash Mahmoudi, Amirreza Davar, Pouya Sohrabipour, Ramesh Bahadur Bist, Yang Tao, Dongyi Wang
{"title":"Leveraging imitation learning in agricultural robotics: a comprehensive survey and comparative analysis.","authors":"Siavash Mahmoudi, Amirreza Davar, Pouya Sohrabipour, Ramesh Bahadur Bist, Yang Tao, Dongyi Wang","doi":"10.3389/frobt.2024.1441312","DOIUrl":"10.3389/frobt.2024.1441312","url":null,"abstract":"<p><p>Imitation learning (IL), a burgeoning frontier in machine learning, holds immense promise across diverse domains. In recent years, its integration into robotics has sparked significant interest, offering substantial advancements in autonomous control processes. This paper presents an exhaustive insight focusing on the implementation of imitation learning techniques in agricultural robotics. The survey rigorously examines varied research endeavors utilizing imitation learning to address pivotal agricultural challenges. Methodologically, this survey comprehensively investigates multifaceted aspects of imitation learning applications in agricultural robotics. The survey encompasses the identification of agricultural tasks that can potentially be addressed through imitation learning, detailed analysis of specific models and frameworks, and a thorough assessment of performance metrics employed in the surveyed studies. Additionally, it includes a comparative analysis between imitation learning techniques and conventional control methodologies in the realm of robotics. The findings derived from this survey unveil profound insights into the applications of imitation learning in agricultural robotics. These methods are highlighted for their potential to significantly improve task execution in dynamic and high-dimensional action spaces prevalent in agricultural settings, such as precision farming. Despite promising advancements, the survey discusses considerable challenges in data quality, environmental variability, and computational constraints that IL must overcome. The survey also addresses the ethical and social implications of implementing such technologies, emphasizing the need for robust policy frameworks to manage the societal impacts of automation. These findings hold substantial implications, showcasing the potential of imitation learning to revolutionize processes in agricultural robotics. This research significantly contributes to envisioning innovative applications and tools within the agricultural robotics domain, promising heightened productivity and efficiency in robotic agricultural systems. It underscores the potential for remarkable enhancements in various agricultural processes, signaling a transformative trajectory for the sector, particularly in the realm of robotics and autonomous systems.</p>","PeriodicalId":47597,"journal":{"name":"Frontiers in Robotics and AI","volume":"11 ","pages":"1441312"},"PeriodicalIF":2.9,"publicationDate":"2024-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11524802/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142559144","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Novel bio-inspired soft actuators for upper-limb exoskeletons: design, fabrication and feasibility study.
IF 2.9
Frontiers in Robotics and AI Pub Date : 2024-10-16 eCollection Date: 2024-01-01 DOI: 10.3389/frobt.2024.1451231
Haiyun Zhang, Gabrielle Naquila, Junghyun Bae, Zonghuan Wu, Ashwin Hingwe, Ashish Deshpande
{"title":"Novel bio-inspired soft actuators for upper-limb exoskeletons: design, fabrication and feasibility study.","authors":"Haiyun Zhang, Gabrielle Naquila, Junghyun Bae, Zonghuan Wu, Ashwin Hingwe, Ashish Deshpande","doi":"10.3389/frobt.2024.1451231","DOIUrl":"10.3389/frobt.2024.1451231","url":null,"abstract":"<p><p>Soft robots have been increasingly utilized as sophisticated tools in physical rehabilitation, particularly for assisting patients with neuromotor impairments. However, many soft robotics for rehabilitation applications are characterized by limitations such as slow response times, restricted range of motion, and low output force. There are also limited studies on the precise position and force control of wearable soft actuators. Furthermore, not many studies articulate how bellow-structured actuator designs quantitatively contribute to the robots' capability. This study introduces a paradigm of upper limb soft actuator design. This paradigm comprises two actuators: the Lobster-Inspired Silicone Pneumatic Robot (LISPER) for the elbow and the Scallop-Shaped Pneumatic Robot (SCASPER) for the shoulder. LISPER is characterized by higher bandwidth, increased output force/torque, and high linearity. SCASPER is characterized by high output force/torque and simplified fabrication processes. Comprehensive analytical models that describe the relationship between pressure, bending angles, and output force for both actuators were presented so the geometric configuration of the actuators can be set to modify the range of motion and output forces. The preliminary test on a dummy arm is conducted to test the capability of the actuators.</p>","PeriodicalId":47597,"journal":{"name":"Frontiers in Robotics and AI","volume":"11 ","pages":"1451231"},"PeriodicalIF":2.9,"publicationDate":"2024-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11521781/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142548289","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Corrigendum: SonoBox: development of a robotic ultrasound tomograph for the ultrasound diagnosis of paediatric forearm fractures.
IF 2.9
Frontiers in Robotics and AI Pub Date : 2024-10-15 eCollection Date: 2024-01-01 DOI: 10.3389/frobt.2024.1505171
Floris Ernst, Jonas Osburg, Ludger Tüshaus
{"title":"Corrigendum: SonoBox: development of a robotic ultrasound tomograph for the ultrasound diagnosis of paediatric forearm fractures.","authors":"Floris Ernst, Jonas Osburg, Ludger Tüshaus","doi":"10.3389/frobt.2024.1505171","DOIUrl":"https://doi.org/10.3389/frobt.2024.1505171","url":null,"abstract":"<p><p>[This corrects the article DOI: 10.3389/frobt.2024.1405169.].</p>","PeriodicalId":47597,"journal":{"name":"Frontiers in Robotics and AI","volume":"11 ","pages":"1505171"},"PeriodicalIF":2.9,"publicationDate":"2024-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11518681/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142548288","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Psychophysics of user acceptance of social cyber-physical systems.
IF 2.9
Frontiers in Robotics and AI Pub Date : 2024-10-15 eCollection Date: 2024-01-01 DOI: 10.3389/frobt.2024.1414853
Maya Dimitrova, Neda Chehlarova, Anastas Madzharov, Aleksandar Krastev, Ivan Chavdarov
{"title":"Psychophysics of user acceptance of social cyber-physical systems.","authors":"Maya Dimitrova, Neda Chehlarova, Anastas Madzharov, Aleksandar Krastev, Ivan Chavdarov","doi":"10.3389/frobt.2024.1414853","DOIUrl":"https://doi.org/10.3389/frobt.2024.1414853","url":null,"abstract":"<p><p>A mini-review of the literature, supporting the view on the psychophysical origins of some user acceptance effects of cyber-physical systems (CPSs), is presented and discussed in this paper. Psychophysics implies the existence of a lawful functional dependence between some aspect/dimension of the stimulation from the environment, entering the senses of the human, and the psychological effect that is being produced by this stimulation, as reflected in the subjective responses. Several psychophysical models are discussed in this mini-review, aiming to support the view that the observed effects of reactance to a robot or the uncanny valley phenomenon are essentially the same subjective effects of different intensity. Justification is provided that human responses to technologically and socially ambiguous stimuli obey some regularity, which can be considered a lawful dependence in a psychophysical sense. The main conclusion is based on the evidence that psychophysics can provide useful and helpful, as well as parsimonious, design recommendations for scenarios with CPSs for social applications.</p>","PeriodicalId":47597,"journal":{"name":"Frontiers in Robotics and AI","volume":"11 ","pages":"1414853"},"PeriodicalIF":2.9,"publicationDate":"2024-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11519208/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142548290","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Targeted weed management of Palmer amaranth using robotics and deep learning (YOLOv7).
IF 2.9
Frontiers in Robotics and AI Pub Date : 2024-10-14 eCollection Date: 2024-01-01 DOI: 10.3389/frobt.2024.1441371
Amlan Balabantaray, Shaswati Behera, CheeTown Liew, Nipuna Chamara, Mandeep Singh, Amit J Jhala, Santosh Pitla
{"title":"Targeted weed management of Palmer amaranth using robotics and deep learning (YOLOv7).","authors":"Amlan Balabantaray, Shaswati Behera, CheeTown Liew, Nipuna Chamara, Mandeep Singh, Amit J Jhala, Santosh Pitla","doi":"10.3389/frobt.2024.1441371","DOIUrl":"10.3389/frobt.2024.1441371","url":null,"abstract":"<p><p>Effective weed management is a significant challenge in agronomic crops which necessitates innovative solutions to reduce negative environmental impacts and minimize crop damage. Traditional methods often rely on indiscriminate herbicide application, which lacks precision and sustainability. To address this critical need, this study demonstrated an AI-enabled robotic system, Weeding robot, designed for targeted weed management. Palmer amaranth (<i>Amaranthus palmeri S. Watson</i>) was selected as it is the most troublesome weed in Nebraska. We developed the full stack (vision, hardware, software, robotic platform, and AI model) for precision spraying using YOLOv7, a state-of-the-art object detection deep learning technique. The Weeding robot achieved an average of 60.4% precision and 62% recall in real-time weed identification and spot spraying with the developed gantry-based sprayer system. The Weeding robot successfully identified Palmer amaranth across diverse growth stages in controlled outdoor conditions. This study demonstrates the potential of AI-enabled robotic systems for targeted weed management, offering a more precise and sustainable alternative to traditional herbicide application methods.</p>","PeriodicalId":47597,"journal":{"name":"Frontiers in Robotics and AI","volume":"11 ","pages":"1441371"},"PeriodicalIF":2.9,"publicationDate":"2024-10-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11513266/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142524080","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Customisation's impact on strengthening affective bonds and decision-making with socially assistive robots.
IF 2.9
Frontiers in Robotics and AI Pub Date : 2024-10-14 eCollection Date: 2024-01-01 DOI: 10.3389/frobt.2024.1384610
Mohammed Shabaj Ahmed, Manuel Giuliani, Ute Leonards, Paul Bremner
{"title":"Customisation's impact on strengthening affective bonds and decision-making with socially assistive robots.","authors":"Mohammed Shabaj Ahmed, Manuel Giuliani, Ute Leonards, Paul Bremner","doi":"10.3389/frobt.2024.1384610","DOIUrl":"10.3389/frobt.2024.1384610","url":null,"abstract":"<p><p>This study aims to fill a gap in understanding how customising robots can affect how humans interact with them, specifically regarding human decision-making and robot perception. The study focused on the robot's ability to persuade participants to follow its suggestions within the Balloon Analogue Risk Task (BART), where participants were challenged to balance the risk of bursting a virtual balloon against the potential reward of inflating it further. A between-subjects design was used, involving 62 participants divided evenly between customised or non-customised robot conditions. Compliance, risk-taking, reaction time, and perceptions of the robot's likability, intelligence, trustworthiness, and ownership were measured using quantitative and qualitative methods. The results showed that there were no significant differences in compliance or risk-taking behaviours between customised and non-customised robots. However, participants in the customised condition reported a significant increase in perceived ownership. Additionally, reaction times were longer in the customised condition, particularly for the \"collect\" suggestion. These results indicate that although customisation may not directly affect compliance or risk-taking, it enhances cognitive engagement and personal connection with robots. Regardless of customisation, the presence of a robot significantly influenced risk-taking behaviours, supporting theories of over-trust in robots and the automation bias. These findings highlight the importance of carefully considering ethical design and effective communication strategies when developing socially assistive robots to manage user trust and expectations, particularly in applications where behavioural influence is involved.</p>","PeriodicalId":47597,"journal":{"name":"Frontiers in Robotics and AI","volume":"11 ","pages":"1384610"},"PeriodicalIF":2.9,"publicationDate":"2024-10-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11513520/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142523330","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0