ACM Transactions on Human-Robot Interaction: Latest Publications

Towards Designing Companion Robots with the End in Mind
IF 5.1
ACM Transactions on Human-Robot Interaction Pub Date: 2023-03-13 DOI: 10.1145/3568294.3580046
Waki Kamino
{"title":"Towards Designing Companion Robots with the End in Mind","authors":"Waki Kamino","doi":"10.1145/3568294.3580046","DOIUrl":"https://doi.org/10.1145/3568294.3580046","url":null,"abstract":"This paper presents an early-stage idea of using 'robot death' as an integral component of human-robot interaction design for companion robots. Reviewing previous discussions around the deaths of companion robots in real-life and popular culture contexts, and analyzing the lifelike design of current companion robots in the market, the paper explores the potential advantages of designing companion robots and human-robot interaction with their 'death' in mind.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":null,"pages":null},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90142545","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Designing a Robot which Touches the User's Head with Intra-Hug Gestures
IF 5.1
ACM Transactions on Human-Robot Interaction Pub Date: 2023-03-13 DOI: 10.1145/3568294.3580096
Yuya Onishi, H. Sumioka, M. Shiomi
{"title":"Designing a Robot which Touches the User's Head with Intra-Hug Gestures","authors":"Yuya Onishi, H. Sumioka, M. Shiomi","doi":"10.1145/3568294.3580096","DOIUrl":"https://doi.org/10.1145/3568294.3580096","url":null,"abstract":"There are a lot of positive benefits of hugging, and several studies have applied its application in human-robot interaction. However, due to the limitation of a robot performance, these robots only touched the human's back. In this study, we developed a hug robot, named \"Moffuly-II.\" This robot can hug not only with intra-hug gestures, but also touch the user's back or head. This paper describes the robot system and the user's impression of hug with the robot.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":null,"pages":null},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90227931","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
On Using Social Signals to Enable Flexible Error-Aware HRI
IF 5.1
ACM Transactions on Human-Robot Interaction Pub Date: 2023-03-13 DOI: 10.1145/3568162.3576990
Maia Stiber, R. Taylor, Chien-Ming Huang
{"title":"On Using Social Signals to Enable Flexible Error-Aware HRI","authors":"Maia Stiber, R. Taylor, Chien-Ming Huang","doi":"10.1145/3568162.3576990","DOIUrl":"https://doi.org/10.1145/3568162.3576990","url":null,"abstract":"Prior error management techniques often do not possess the versatility to appropriately address robot errors across tasks and scenarios. Their fundamental framework involves explicit, manual error management and implicit domain-specific information driven error management, tailoring their response for specific interaction contexts. We present a framework for approaching error-aware systems by adding implicit social signals as another information channel to create more flexibility in application. To support this notion, we introduce a novel dataset (composed of three data collections) with a focus on understanding natural facial action unit (AU) responses to robot errors during physical-based human-robot interactions---varying across task, error, people, and scenario. Analysis of the dataset reveals that, through the lens of error detection, using AUs as input into error management affords flexibility to the system and has the potential to improve error detection response rate. In addition, we provide an example real-time interactive robot error management system using the error-aware framework.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":null,"pages":null},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79274916","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 5
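The following sketch is illustrative only and is not code from the paper above. It shows one way facial action unit (AU) intensities could feed a simple error-detection gate, in the spirit of the implicit social-signal channel the abstract describes. The AU names follow the common FACS convention; the rolling baseline, thresholds, and two-AU agreement rule are assumptions for illustration.

```python
# Illustrative sketch: treating facial action unit (AU) intensities as an implicit
# social signal for robot error detection. The thresholds and the simple rolling
# baseline are hypothetical and would need tuning against a real dataset.
from collections import deque
from statistics import mean

class AUErrorDetector:
    """Flags a likely robot error when selected AU intensities spike above a
    person-specific rolling baseline."""

    def __init__(self, aus=("AU04", "AU07", "AU12"), window=30, threshold=2.0):
        self.aus = aus                      # AUs assumed to react to robot errors
        self.window = window                # frames used to estimate the baseline
        self.threshold = threshold          # spike size (relative to baseline) that triggers
        self.history = {au: deque(maxlen=window) for au in aus}

    def update(self, frame: dict) -> bool:
        """frame maps AU name -> intensity (e.g. 0-5 from an AU estimator)."""
        spikes = 0
        for au in self.aus:
            value = frame.get(au, 0.0)
            baseline = mean(self.history[au]) if self.history[au] else value
            if baseline > 0 and value > self.threshold * baseline:
                spikes += 1
            self.history[au].append(value)
        return spikes >= 2                  # require agreement of at least two AUs

# Usage: feed per-frame AU estimates from any facial-analysis pipeline.
detector = AUErrorDetector()
for _ in range(10):                         # calm baseline frames
    detector.update({"AU04": 0.5, "AU07": 0.4, "AU12": 0.3})
if detector.update({"AU04": 3.2, "AU07": 2.8, "AU12": 0.4}):
    print("Possible robot error detected; trigger recovery behaviour.")
```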
Robot-Supported Information Search: Which Conversational Interaction Style do Children Prefer?
IF 5.1
ACM Transactions on Human-Robot Interaction Pub Date: 2023-03-13 DOI: 10.1145/3568294.3580128
Suyash Sharma, T. Beelen, K. Truong
{"title":"Robot-Supported Information Search: Which Conversational Interaction Style do Children Prefer?","authors":"Suyash Sharma, T. Beelen, K. Truong","doi":"10.1145/3568294.3580128","DOIUrl":"https://doi.org/10.1145/3568294.3580128","url":null,"abstract":"Searching via speech with a robot can be used to better support children in expressing their information needs. We report on an exploratory study where children (N=35) worked on search tasks with two robots using different interaction styles. One system posed closed, yes/no questions and was more system-driven while the other system used open-ended questions and was more user-driven. We studied children's preferences and experiences of these interaction styles using questionnaires and semi-structured interviews. We found no overall strong preference between the interaction styles. However, some children reported task-dependent preferences. We further report on children's interpretation and reasoning around interaction styles for robots supporting information search.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":null,"pages":null},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84693712","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Effects of Predictive Robot Eyes on Trust and Task Performance in an Industrial Cooperation Task
IF 5.1
ACM Transactions on Human-Robot Interaction Pub Date: 2023-03-13 DOI: 10.1145/3568294.3580123
L. Onnasch, Paul Schweidler, Maximilian Wieser
{"title":"Effects of Predictive Robot Eyes on Trust and Task Performance in an Industrial Cooperation Task","authors":"L. Onnasch, Paul Schweidler, Maximilian Wieser","doi":"10.1145/3568294.3580123","DOIUrl":"https://doi.org/10.1145/3568294.3580123","url":null,"abstract":"Industrial cobots can perform variable action sequences. For human-robot interaction (HRI) this can have detrimental effects, as the robot's actions can be difficult to predict. In human interaction, eye gaze intuitively directs attention and communicates subsequent actions. Whether this mechanism can benefit HRI, too, is not well understood. This study investigated the impact of anthropomorphic eyes as directional cues in robot design. 42 participants worked on two subsequent tasks in an embodied HRI with a Sawyer robot. The study used a between-subject design and presented either anthropomorphic eyes, arrows or a black screen as control condition on the robot's display. Results showed that neither directional stimuli nor the anthropomorphic design in particular led to increased trust. But anthropomorphic robot eyes improved the prediction speed, whereas this effect could not be found for non-anthropomorphic cues (arrows). Anthropomorphic eyes therefore seem to be better suitable for an implementation on an industrial robot.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":null,"pages":null},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88185611","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
People Dynamically Update Trust When Interactively Teaching Robots
IF 5.1
ACM Transactions on Human-Robot Interaction Pub Date: 2023-03-13 DOI: 10.1145/3568162.3576962
V. B. Chi, B. Malle
{"title":"People Dynamically Update Trust When Interactively Teaching Robots","authors":"V. B. Chi, B. Malle","doi":"10.1145/3568162.3576962","DOIUrl":"https://doi.org/10.1145/3568162.3576962","url":null,"abstract":"Human-robot trust research often measures people's trust in robots in individual scenarios. However, humans may update their trust dynamically as they continuously interact with a robot. In a well-powered study (n = 220), we investigate the trust updating process across a 15-trial interaction. In a novel paradigm, participants act in the role of teacher to a simulated robot on a smartphone-based platform, and we assess trust at multiple levels (momentary trust feelings, perceptions of trustworthiness, and intended reliance). Results reveal that people are highly sensitive to the robot's learning progress trial by trial: they take into account both previous-task performance, current-task difficulty, and cumulative learning across training. More integrative perceptions of robot trustworthiness steadily grow as people gather more evidence from observing robot performance, especially of faster-learning robots. Intended reliance on the robot in novel tasks increased only for faster-learning robots.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":null,"pages":null},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91039338","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 5
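The following sketch is illustrative only and is not the model from the paper above. It shows a minimal trial-by-trial trust update in which each teaching trial's outcome is weighted by task difficulty and folded into a running trust estimate, echoing the abstract's finding that people weigh performance, difficulty, and cumulative learning. The update rule, weights, and 0-1 scaling are assumptions.

```python
# Illustrative sketch: a simple trial-by-trial trust update for a teaching interaction.

def update_trust(trust: float, success: bool, difficulty: float,
                 learning_rate: float = 0.15) -> float:
    """Return an updated trust estimate in [0, 1].

    trust       -- current trust estimate in [0, 1]
    success     -- whether the robot performed the current task correctly
    difficulty  -- task difficulty in [0, 1]; harder tasks carry more evidence
    """
    evidence = 1.0 if success else 0.0
    # Harder tasks move trust more, in either direction.
    weight = learning_rate * (0.5 + difficulty)
    new_trust = trust + weight * (evidence - trust)
    return min(1.0, max(0.0, new_trust))

# Usage: simulate a 15-trial interaction with a robot that improves over time.
trust = 0.5
for trial in range(15):
    robot_succeeds = trial >= 4            # hypothetical: robot learns after trial 4
    trust = update_trust(trust, robot_succeeds, difficulty=0.6)
    print(f"trial {trial + 1:2d}: trust = {trust:.2f}")
```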
Reactive Planning for Coordinated Handover of an Autonomous Aerial Manipulator
IF 5.1
ACM Transactions on Human-Robot Interaction Pub Date: 2023-03-13 DOI: 10.1145/3568294.3580055
Jérôme Truc, D. Sidobre, R. Alami
{"title":"Reactive Planning for Coordinated Handover of an Autonomous Aerial Manipulator","authors":"Jérôme Truc, D. Sidobre, R. Alami","doi":"10.1145/3568294.3580055","DOIUrl":"https://doi.org/10.1145/3568294.3580055","url":null,"abstract":"In this paper, we present a coordinated and reactive human-aware motion planner for performing a handover task by an autonomous aerial manipulator (AAM). We present a method to determine the final state of the AAM for a handover task based on the current state of the human and the surrounding obstacles. We consider the visual field of the human and the effort to turn the head and see the AAM as well as the discomfort caused to the human. We apply these social constraints together with the kinematic constraints of the AAM to determine its coordinated motion along the trajectory.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":null,"pages":null},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91306181","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
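The following sketch is illustrative only and is not the planner from the paper above. It scores candidate handover positions by combining human-centred costs (visual field, head-turn effort, discomfort from proximity) with a reachability check, mirroring the constraints listed in the abstract. The pose format, cost terms, weights, and distances are hypothetical.

```python
# Illustrative sketch: scoring candidate handover poses for an aerial manipulator.
import math

def handover_cost(candidate_xy, human_xy, human_heading, reachable: bool,
                  w_view=1.0, w_effort=0.5, w_comfort=2.0) -> float:
    """Lower is better; infeasible candidates get infinite cost."""
    if not reachable:
        return math.inf

    dx = candidate_xy[0] - human_xy[0]
    dy = candidate_xy[1] - human_xy[1]
    distance = math.hypot(dx, dy)

    # Angle the human must turn their head to look at the candidate pose.
    bearing = math.atan2(dy, dx)
    head_turn = abs(math.atan2(math.sin(bearing - human_heading),
                               math.cos(bearing - human_heading)))

    view_cost = 0.0 if head_turn < math.radians(30) else head_turn   # inside the visual field?
    effort_cost = head_turn                                          # effort to turn the head
    comfort_cost = max(0.0, 1.2 - distance)                          # penalise closer than 1.2 m

    return w_view * view_cost + w_effort * effort_cost + w_comfort * comfort_cost

# Usage: pick the best of a few candidate handover positions.
human, heading = (0.0, 0.0), 0.0
candidates = [(1.0, 0.2), (0.5, 1.5), (-1.0, 0.0)]
best = min(candidates, key=lambda c: handover_cost(c, human, heading, reachable=True))
print("chosen handover position:", best)
```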
The Robot Made Us Hear Each Other: Fostering Inclusive Conversations among Mixed-Visual Ability Children
IF 5.1
ACM Transactions on Human-Robot Interaction Pub Date: 2023-03-13 DOI: 10.1145/3568162.3576997
Isabel Neto, Filipa Correia, Filipa Rocha, Patricia Piedade, Ana Paiva, Hugo Nicolau
{"title":"The Robot Made Us Hear Each Other: Fostering Inclusive Conversations among Mixed-Visual Ability Children","authors":"Isabel Neto, Filipa Correia, Filipa Rocha, Patricia Piedade, Ana Paiva, Hugo Nicolau","doi":"10.1145/3568162.3576997","DOIUrl":"https://doi.org/10.1145/3568162.3576997","url":null,"abstract":"Inclusion is key in group work and collaborative learning. We developed a mediator robot to support and promote inclusion in group conversations, particularly in groups composed of children with and without visual impairment. We investigate the effect of two mediation strategies on group dynamics, inclusion, and perception of the robot. We conducted a within-subjects study with 78 children, 26 experienced visual impairments, in a decision-making activity. Results indicate that the robot can foster inclusion in mixed-visual ability group conversations. The robot succeeds in balancing participation, particularly when using a highly intervening mediating strategy (directive strategy). However, children feel more heard by their peers when the robot is less intervening (organic strategy). We extend prior work on social robots to assist group work and contribute with a mediator robot that enables children with visual impairments to engage equally in group conversations. We finish by discussing design implications for inclusive social robots.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":null,"pages":null},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90817996","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
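The following sketch is illustrative only and is not from the paper above. It shows a simple turn-taking mediator that monitors per-child speaking time and intervenes when participation becomes unbalanced, with a more intervening option and a subtler option labelled after the "directive" and "organic" strategies named in the abstract. The trigger rule, threshold, and utterances are invented for illustration.

```python
# Illustrative sketch: a participation-balancing mediator policy.

def choose_intervention(speaking_time: dict, strategy: str = "organic"):
    """Return (child_to_invite, utterance) or None if participation is balanced.

    speaking_time -- child name -> seconds spoken in the current window
    strategy      -- "directive" (explicit, highly intervening) or "organic" (subtle)
    """
    total = sum(speaking_time.values())
    if total == 0:
        return None
    quietest = min(speaking_time, key=speaking_time.get)
    share = speaking_time[quietest] / total
    if share >= 1.0 / (2 * len(speaking_time)):   # within half of an equal share
        return None

    if strategy == "directive":
        return quietest, f"{quietest}, what do you think we should choose?"
    return quietest, "I wonder if everyone has had a chance to share an idea."

# Usage: after each decision-making round, let the robot decide whether to step in.
print(choose_intervention({"Ana": 40.0, "Rui": 35.0, "Ines": 5.0}, strategy="directive"))
```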
Perception-Intention-Action Cycle as a Human Acceptable Way for Improving Human-Robot Collaborative Tasks
IF 5.1
ACM Transactions on Human-Robot Interaction Pub Date: 2023-03-13 DOI: 10.1145/3568294.3580149
J. E. Domínguez-Vidal, Nicolás Rodríguez, A. Sanfeliu
{"title":"Perception-Intention-Action Cycle as a Human Acceptable Way for Improving Human-Robot Collaborative Tasks","authors":"J. E. Domínguez-Vidal, Nicolás Rodríguez, A. Sanfeliu","doi":"10.1145/3568294.3580149","DOIUrl":"https://doi.org/10.1145/3568294.3580149","url":null,"abstract":"In Human-Robot Collaboration (HRC) tasks, the classical Perception-Action cycle can not fully explain the collaborative behaviour of the human-robot pair until it is extended to Perception-Intention-Action (PIA) cycle, giving to the human's intention a key role at the same level of the robot's perception and not as a subblock of this. Although part of the human's intention can be perceived or inferred by the other agent, this is prone to misunderstandings so the true intention has to be explicitly informed in some cases to fulfill the task. Here, we explore both types of intention and we combine them with the robot's perception through the concept of Situation Awareness (SA). We validate the PIA cycle and its acceptance by the user with a preliminary experiment in an object transportation task showing that its usage can increase trust in the robot.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":null,"pages":null},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89696048","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
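The following sketch is illustrative only and is not from the paper above. It shows one way inferred intention (estimated from perception) and explicitly communicated intention could be fused inside a Perception-Intention-Action loop before acting. The data structure, confidence handling, and action choice are assumptions.

```python
# Illustrative sketch: fusing inferred and explicitly informed intention in a PIA step.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Intention:
    goal: str          # e.g. "move_table_left"
    confidence: float  # 0-1

def fuse_intention(inferred: Intention, informed: Optional[Intention]) -> Intention:
    """Explicitly informed intention overrides inference when available, since
    inferred intention is prone to misunderstandings."""
    return informed if informed is not None else inferred

def pia_step(percept: dict, informed: Optional[Intention] = None) -> str:
    # Perception: a hypothetical estimator turns force/motion cues into an intention guess.
    inferred = Intention(goal=percept.get("likely_goal", "hold_position"),
                         confidence=percept.get("goal_confidence", 0.3))
    # Intention: combine both channels into the robot's situation awareness.
    intention = fuse_intention(inferred, informed)
    # Action: act only when the fused intention is trustworthy enough.
    return intention.goal if intention.confidence > 0.5 else "ask_for_clarification"

# Usage: low-confidence inference triggers clarification unless the human states the goal.
print(pia_step({"likely_goal": "move_table_left", "goal_confidence": 0.4}))
print(pia_step({"likely_goal": "move_table_left", "goal_confidence": 0.4},
               informed=Intention("move_table_left", 1.0)))
```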
Making Music More Inclusive with Hospiano
IF 5.1
ACM Transactions on Human-Robot Interaction Pub Date: 2023-03-13 DOI: 10.1145/3568294.3580184
Chacharin Lertyosbordin, Nichaput Khurukitwanit, Teeratas Asavareongchai, Sirin Liukasemsarn
{"title":"Making Music More Inclusive with Hospiano","authors":"Chacharin Lertyosbordin, Nichaput Khurukitwanit, Teeratas Asavareongchai, Sirin Liukasemsarn","doi":"10.1145/3568294.3580184","DOIUrl":"https://doi.org/10.1145/3568294.3580184","url":null,"abstract":"Music brings people together; it is a universal language that can help us be more expressive and help us understand our feelings and emotions in a better manner. The \"Hospiano\" robot is a prototype developed with the goal of making music accessible to all, regardless of physical ability. The robot acts as a pianist and can be placed in hospital lobbies and wards, playing the piano in response to the gestures and facial expressions of patients (i.e. head movement, eye and mouth movement, and proximity). It has three main modes of operation: \"Robot Pianist mode\", in which it plays pre-existing songs; \"Play Along mode\", which allows anyone to interact with the music; and \"Composer mode\", which allows patients to create their own music. The software that controls the prototype's actions runs on the Robot Operating System (ROS). It has been proven that humans and robots can interact fluently via a robot's vision, which opens up a wide range of possibilities for further interactions between these logical machines and more emotive beings like humans, resulting in an improvement in the quality of life of people who use it, increased inclusivity, and a better world for future generations to live in.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":null,"pages":null},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90366005","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
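The following sketch is illustrative only and is not the Hospiano source code. It shows a minimal ROS 1 (rospy) node that maps detected gestures to piano notes and switches between the three modes named in the abstract. Topic names, message contents, and the gesture-to-note mapping are hypothetical.

```python
#!/usr/bin/env python
# Illustrative sketch: a minimal rospy node mapping gestures to notes across modes.
import rospy
from std_msgs.msg import String

MODES = ("robot_pianist", "play_along", "composer")

class HospianoNode:
    def __init__(self):
        self.mode = "play_along"
        self.note_pub = rospy.Publisher("/hospiano/note", String, queue_size=10)
        rospy.Subscriber("/hospiano/gesture", String, self.on_gesture)
        rospy.Subscriber("/hospiano/mode", String, self.on_mode)

    def on_mode(self, msg):
        if msg.data in MODES:
            self.mode = msg.data

    def on_gesture(self, msg):
        # In "robot_pianist" mode the robot plays pre-existing songs, so user
        # gestures are ignored; the other two modes map gestures to notes.
        if self.mode == "robot_pianist":
            return
        note = {"head_left": "C4", "head_right": "E4",
                "mouth_open": "G4", "eyes_closed": "A4"}.get(msg.data)
        if note:
            self.note_pub.publish(String(data=note))

if __name__ == "__main__":
    rospy.init_node("hospiano_sketch")
    HospianoNode()
    rospy.spin()
```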