Conducting neuropsychological tests with a humanoid robot: Design and evaluation

Duc-Canh Nguyen, G. Bailly, F. Elisei
{"title":"Conducting neuropsychological tests with a humanoid robot: Design and evaluation","authors":"Duc-Canh Nguyen, G. Bailly, F. Elisei","doi":"10.1109/COGINFOCOM.2016.7804572","DOIUrl":null,"url":null,"abstract":"Socially assistive robot with interactive behavioral capability have been improving quality of life for a wide range of users by taking care of elderlies, training individuals with cognitive disabilities or physical rehabilitation, etc. While the interactive behavioral policies of most systems are scripted, we discuss here key features of a new methodology that enables professional caregivers to teach a socially assistive robot (SAR) how to perform the assistive tasks while giving proper instructions, demonstrations and feedbacks. We describe here how socio-communicative gesture controllers - which actually control the speech, the facial displays and hand gestures of our iCub robot - are driven by multimodal events captured on a professional human demonstrator performing a neuropsychological interview. Furthermore, we propose an original online evaluation method for rating the multimodal interactive behaviors of the SAR and show how such a method can help designers to identify the faulty events.","PeriodicalId":440408,"journal":{"name":"2016 7th IEEE International Conference on Cognitive Infocommunications (CogInfoCom)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 7th IEEE International Conference on Cognitive Infocommunications (CogInfoCom)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/COGINFOCOM.2016.7804572","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 8

Abstract

Socially assistive robots with interactive behavioral capabilities have been improving the quality of life of a wide range of users by caring for the elderly, training individuals with cognitive disabilities, assisting physical rehabilitation, etc. While the interactive behavioral policies of most systems are scripted, we discuss here the key features of a new methodology that enables professional caregivers to teach a socially assistive robot (SAR) how to perform assistive tasks while giving proper instructions, demonstrations and feedback. We describe how socio-communicative gesture controllers - which control the speech, facial displays and hand gestures of our iCub robot - are driven by multimodal events captured from a professional human demonstrator performing a neuropsychological interview. Furthermore, we propose an original online evaluation method for rating the multimodal interactive behaviors of the SAR and show how such a method can help designers identify faulty events.
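
The paper itself provides no code; as a rough illustration of the event-driven architecture the abstract describes, the minimal Python sketch below dispatches time-stamped multimodal events (of the kind one might capture from a human demonstrator) to per-modality handlers. All names (`MultimodalEvent`, `GestureController`, the modality labels) are hypothetical assumptions and not taken from the authors' implementation.

```python
# Hypothetical sketch: routing captured multimodal events to the
# modality-specific controllers of a humanoid robot.
# Class, method, and modality names are illustrative, not the authors' API.

from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class MultimodalEvent:
    timestamp: float   # seconds from the start of the interview
    modality: str      # e.g. "speech", "facial_display", "hand_gesture"
    payload: dict      # modality-specific data (text, expression, trajectory, ...)


class GestureController:
    """Replays timed multimodal events through registered handlers."""

    def __init__(self) -> None:
        self.handlers: Dict[str, Callable[[dict], None]] = {}

    def register(self, modality: str, handler: Callable[[dict], None]) -> None:
        self.handlers[modality] = handler

    def play(self, events: List[MultimodalEvent]) -> None:
        # Replay events in temporal order; a real controller would also
        # synchronize against a clock and blend overlapping gestures.
        for ev in sorted(events, key=lambda e: e.timestamp):
            handler = self.handlers.get(ev.modality)
            if handler is not None:
                handler(ev.payload)


if __name__ == "__main__":
    controller = GestureController()
    controller.register("speech", lambda p: print("say:", p["text"]))
    controller.register("hand_gesture", lambda p: print("gesture:", p["name"]))

    demo = [
        MultimodalEvent(0.0, "speech", {"text": "Please name the animals you see."}),
        MultimodalEvent(0.8, "hand_gesture", {"name": "point_to_card"}),
    ]
    controller.play(demo)
```

In this sketch the event log stands in for the annotated recording of the caregiver's demonstration; the design choice of keying handlers by modality simply mirrors the abstract's separation of speech, facial displays, and hand gestures.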