International Journal of Social Robotics: Latest Publications

Gaze-Cues of Humans and Robots on Pedestrian Ways
IF 4.7 · CAS Zone 2 · Computer Science
International Journal of Social Robotics · Pub Date: 2023-12-16 · DOI: 10.1007/s12369-023-01064-3
Carla S. Jakobowsky, Anna M. H. Abrams, Astrid M. Rosenthal-von der Pütten
{"title":"Gaze-Cues of Humans and Robots on Pedestrian Ways","authors":"Carla S. Jakobowsky, Anna M. H. Abrams, Astrid M. Rosenthal-von der Pütten","doi":"10.1007/s12369-023-01064-3","DOIUrl":"https://doi.org/10.1007/s12369-023-01064-3","url":null,"abstract":"<p>Delivery robots and personal cargo robots are increasingly sharing space with incidentally co-present persons (InCoPs) on pedestrian ways facing the challenge of socially adequate and safe navigation. Humans are able to effortlessly negotiate this shared space by signalling their skirting intentions via non-verbal gaze cues. In two online-experiments we investigated whether this phenomenon of gaze cuing can be transferred to human–robot interaction. In the first study, participants (<i>n</i> = 92) watched short videos in which either a human, a humanoid robot or a non-humanoid delivery robot moved towards the camera. In each video, the counterpart looked either straight towards the camera or did an eye movement to the right or left. The results showed that when the counterpart gaze cued to their left, also participants skirted more often to the left from their perspective, thereby walking past each other and avoiding collision. Since the participants were recruited in a right-hand driving country we replicated the study in left-hand driving countries (<i>n</i> = 176). Results showed that participants skirted more often to the right when the counterpart gaze cued to the right, and to the left in case of eye movements to the left, expanding our previous result. In both studies, skirting behavior did not differ regarding the type of counterpart. Hence, gaze cues increase the chance to trigger complementary skirting behavior in InCoPs independently of the robot morphology. Equipping robots with eyes can help to indicate moving direction by gaze cues and thereby improve interactions between humans and robots on pedestrian ways.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"20 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138683896","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Correction to: Autonomous Systems and Technology Resistance: New Tools for Monitoring Acceptance, Trust, and Tolerance
IF 4.7 · CAS Zone 2 · Computer Science
International Journal of Social Robotics · Pub Date: 2023-12-14 · DOI: 10.1007/s12369-023-01081-2
M. Cappuccio, Jai C. Galliott, Friederike Eyssel, Alessandro Lanteri
{"title":"Correction to: Autonomous Systems and Technology Resistance: New Tools for Monitoring Acceptance, Trust, and Tolerance","authors":"M. Cappuccio, Jai C. Galliott, Friederike Eyssel, Alessandro Lanteri","doi":"10.1007/s12369-023-01081-2","DOIUrl":"https://doi.org/10.1007/s12369-023-01081-2","url":null,"abstract":"","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"150 1","pages":"1"},"PeriodicalIF":4.7,"publicationDate":"2023-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139180002","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Design Attributes of Socially Assistive Robots for People with Dementia: A Systematic Review
IF 4.7 · CAS Zone 2 · Computer Science
International Journal of Social Robotics · Pub Date: 2023-12-08 · DOI: 10.1007/s12369-023-01087-w
Matthew Green, Dzung Dao, Wendy Moyle
{"title":"Design Attributes of Socially Assistive Robots for People with Dementia: A Systematic Review","authors":"Matthew Green, Dzung Dao, Wendy Moyle","doi":"10.1007/s12369-023-01087-w","DOIUrl":"https://doi.org/10.1007/s12369-023-01087-w","url":null,"abstract":"<p>Socially assistive robots (SARs) have shown promise in the care of people with dementia and in mitigating behavioural and psychological symptoms of dementia. Although SARs are continually tested for efficacy, no current literature outlines a comprehensive strategy that industrial designers may employ to progress the technology of SARs. It was, therefore, essential to expand on existing literature by providing a straightforward approach to SAR design with the recommended design attributes. A systematic review was conducted to formulate recommendations for designing SARs to improve the quality of life of people with dementia. Six databases, including CINAHL, Embase, IEEE, Medline, ProQuest, and Scopus, were searched for relevant articles published between 2011 and 2022. Covidence software was used for screening, data extraction and quality testing. Of the 160 references extracted, 16 studies met the study inclusion criteria. The studies were predominately small sample sizes using various robotic platforms and technologies. Incorporating personal preferences linked to a user’s life experience and choice is a crucial ability of SARs. Natural speech communication is also an important design attribute. However, the overwhelming conclusion is that more research is needed on aesthetics, materials, and interaction capabilities. All stakeholders should be part of a holistic user-centred design process to ensure a fit-for-purpose product.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"19 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138553206","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Does the Social Robot Nao Facilitate Cooperation in High Functioning Children with ASD?
IF 4.7 · CAS Zone 2 · Computer Science
International Journal of Social Robotics · Pub Date: 2023-12-05 · DOI: 10.1007/s12369-023-01063-4
Viviane Kostrubiec, Chloé Lajunta, Pierre-Vincent Paubel, Jeanne Kruck
{"title":"Does the Social Robot Nao Facilitate Cooperation in High Functioning Children with ASD?","authors":"Viviane Kostrubiec, Chloé Lajunta, Pierre-Vincent Paubel, Jeanne Kruck","doi":"10.1007/s12369-023-01063-4","DOIUrl":"https://doi.org/10.1007/s12369-023-01063-4","url":null,"abstract":"<p>We designed a coordination–cooperation game dedicated to teaching the theory of mind (ToM) to children with autism spectrum disorder. Children interacted with either a robot or a human. They had to coordinate their gestures with the beats of a ditty sung by their partner (coordination), who then implicitly asked them for help (cooperation). Before and after this cooperation–coordination task, the children performed a helping task that assessed their ToM skills: the ability to infer social partners’ intentions. Despite the regularity and predictability of the robot, children made the most progress in the helping task after interacting with a human. Motor coupling was more stable in child–human than in child–robot dyads. The ability of the social partner to actively maintain a stable social coupling seems to be a primary factor inciting the child to learn and transfer the just-practiced social skills.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"47 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138543758","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Moffuly-II: A Robot that Hugs and Rubs Heads
IF 4.7 · CAS Zone 2 · Computer Science
International Journal of Social Robotics · Pub Date: 2023-12-05 · DOI: 10.1007/s12369-023-01070-5
Yuya Onishi, Hidenobu Sumioka, Masahiro Shiomi
{"title":"Moffuly-II: A Robot that Hugs and Rubs Heads","authors":"Yuya Onishi, Hidenobu Sumioka, Masahiro Shiomi","doi":"10.1007/s12369-023-01070-5","DOIUrl":"https://doi.org/10.1007/s12369-023-01070-5","url":null,"abstract":"<p>Although whole-body touch interaction, e.g., hugging, is essential for human beings from various perspectives, not everyone can interact with intimate friends/family due to physical separations caused by such circumstances as pandemics, geographical constraints, etc. The possibility of human–robot touch interaction is one approach that ameliorates such missing touch interactions. In this study, we developed a robot named Moffuly-II, that hugs people and rubs their heads during a hug because head-touching behaviors are typical affective interactions between intimate persons. Moffuly-II is a large huggable teddy-bear type robot and it has enough capability to both hug and touch the head. We conducted an experiment with human participants and evaluated the effectiveness of combining intra-hug gestures (squeezing and rubbing) and the touch area (back and head). From experimental results, we identified the advantages of implementing rubbing gestures compared to squeezing gestures and some of the advantages of head-touching behaviors compared to back-touching behaviors.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"875 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138529961","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Experimental and Integrative Approaches to Robo-ethics. An Introduction
IF 4.7 · CAS Zone 2 · Computer Science
International Journal of Social Robotics · Pub Date: 2023-12-04 · DOI: 10.1007/s12369-023-01084-z
Francesco Bianchini, Luisa Damiano, E. Datteri, Pierluigi Graziani
{"title":"Experimental and Integrative Approaches to Robo-ethics. An Introduction","authors":"Francesco Bianchini, Luisa Damiano, E. Datteri, Pierluigi Graziani","doi":"10.1007/s12369-023-01084-z","DOIUrl":"https://doi.org/10.1007/s12369-023-01084-z","url":null,"abstract":"","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"61 3","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138604881","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Do Robots Have Sex? A Prolegomenon
IF 4.7 · CAS Zone 2 · Computer Science
International Journal of Social Robotics · Pub Date: 2023-12-04 · DOI: 10.1007/s12369-023-01052-7
Robert Sparrow, Eliana Horn, Friederike Eyssel
{"title":"Do Robots Have Sex? A Prolegomenon","authors":"Robert Sparrow, Eliana Horn, Friederike Eyssel","doi":"10.1007/s12369-023-01052-7","DOIUrl":"https://doi.org/10.1007/s12369-023-01052-7","url":null,"abstract":"<p>Research in Human–Robot Interaction (HRI) suggests that people attribute gender to (some) robots. In this paper we outline a program of research on the gendering of robots and on the ethical issues raised by such gendering. Understanding which robots are gendered, when, and why, will require careful research in HRI, drawing on anthropology and social psychology, informed by state-of-the-art research in gender studies and critical theory. Design features of robots that might influence the attribution of gender include: appearance; tone of voice; speech repertoire; range and style of movement; behaviour; and, intended function. Robots may be gendered differently depending on: the age, class, sex, ethnicity, and sexuality of the person doing the attributing; local cultural histories; social cues from the designers, the physical and institutional environment, and other users; and the role of the robot. An adequate account of the gender of robots will also need to pay attention to the limits of a sex/gender distinction, which has historically been maintained by reference to a “sex” located in a biological body, when it comes to theorising the gender of robots. We argue that, on some accounts of what it is to be sexed, robots might “have” sex: they might be male and female in just the same way as (most) human beings are. Addressing the ethical issues raised by the gendering of robots will require further progress in “robot media ethics”, as well as an account of the responsibilities of both designers and users in a broader social context.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"36 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138529960","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Artificial Emotions: Toward a Human-Centric Ethics
IF 4.7 · CAS Zone 2 · Computer Science
International Journal of Social Robotics · Pub Date: 2023-12-01 · DOI: 10.1007/s12369-022-00890-1
Laura Corti, Nicola Di Stefano, Marta Bertolaso
{"title":"Artificial Emotions: Toward a Human-Centric Ethics","authors":"Laura Corti, Nicola Di Stefano, Marta Bertolaso","doi":"10.1007/s12369-022-00890-1","DOIUrl":"https://doi.org/10.1007/s12369-022-00890-1","url":null,"abstract":"","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"40 4","pages":"2039-2053"},"PeriodicalIF":4.7,"publicationDate":"2023-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139187759","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Building Long-Term Human–Robot Relationships: Examining Disclosure, Perception and Well-Being Across Time
IF 4.7 · CAS Zone 2 · Computer Science
International Journal of Social Robotics · Pub Date: 2023-11-30 · DOI: 10.1007/s12369-023-01076-z
Guy Laban, Arvid Kappas, Val Morrison, Emily S. Cross
{"title":"Building Long-Term Human–Robot Relationships: Examining Disclosure, Perception and Well-Being Across Time","authors":"Guy Laban, Arvid Kappas, Val Morrison, Emily S. Cross","doi":"10.1007/s12369-023-01076-z","DOIUrl":"https://doi.org/10.1007/s12369-023-01076-z","url":null,"abstract":"<p>While interactions with social robots are novel and exciting for many people, one concern is the extent to which people’s behavioural and emotional engagement might be sustained across time, since during initial interactions with a robot, its novelty is especially salient. This challenge is particularly noteworthy when considering interactions designed to support people’s well-being, with limited evidence (or empirical exploration) of social robots’ capacity to support people’s emotional health over time. Accordingly, our aim here was to examine how long-term repeated interactions with a social robot affect people’s self-disclosure behaviour toward the robot, their perceptions of the robot, and how such sustained interactions influence factors related to well-being. We conducted a mediated long-term online experiment with participants conversing with the social robot Pepper 10 times over 5 weeks. We found that people self-disclose increasingly more to a social robot over time, and report the robot to be more social and competent over time. Participants’ moods also improved after talking to the robot, and across sessions, they found the robot’s responses increasingly comforting as well as reported feeling less lonely. Finally, our results emphasize that when the discussion frame was supposedly more emotional (in this case, framing questions in the context of the COVID-19 pandemic), participants reported feeling lonelier and more stressed. These results set the stage for situating social robots as conversational partners and provide crucial evidence for their potential inclusion in interventions supporting people’s emotional health through encouraging self-disclosure.\u0000</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"875 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-11-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138529968","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Expanding the Interaction Repertoire of a Social Drone: Physically Expressive Possibilities of a Perched BiRDe
IF 4.7 · CAS Zone 2 · Computer Science
International Journal of Social Robotics · Pub Date: 2023-11-29 · DOI: 10.1007/s12369-023-01079-w
Ori Fartook, Karon MacLean, Tal Oron-Gilad, Jessica R. Cauchard
{"title":"Expanding the Interaction Repertoire of a Social Drone: Physically Expressive Possibilities of a Perched BiRDe","authors":"Ori Fartook, Karon MacLean, Tal Oron-Gilad, Jessica R. Cauchard","doi":"10.1007/s12369-023-01079-w","DOIUrl":"https://doi.org/10.1007/s12369-023-01079-w","url":null,"abstract":"<p>The field of human–drone interaction (HDI) has investigated an increasing number of applications for social drones, all while focusing on the drone’s inherent ability to fly, thus overpassing interaction opportunities, such as a drone in its perched (i.e., non-flying) state. A drone cannot constantly fly and a need for more realistic HDI is needed, therefore, in this exploratory work, we have decoupled a social drone’s flying state from its perched state and investigated user interpretations of its physical rendering. To do so, we designed and developed BiRDe: a Bodily expressIons and Respiration Drone conveying Emotions. BiRDe was designed to render a range of emotional states by modulating its respiratory rate (RR) and changing its body posture using reconfigurable wings and head positions. Following its design, a validation study was conducted. In a laboratory study, participants (<span>({N}={30})</span>) observed and labeled twelve of BiRDe’s emotional behaviors using Valence and Arousal based emotional states. We identified consistent patterns in how BiRDe’s RR, wings, and head had influenced perception in terms of valence, arousal, and willingness to interact. Furthermore, participants interpreted 11 out of the 12 behaviors in line with our initial design intentions. This work demonstrates a drone’s ability to communicate emotions even while perched and offers design implications and future applications.\u0000</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"875 1","pages":""},"PeriodicalIF":4.7,"publicationDate":"2023-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138529980","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0