Brain-computer interface controlled robotic choreography based on motor imagery EEG

IF 2.4 · CAS Tier 3 (Computer Science) · JCR Q2, Computer Science, Cybernetics
Jing Li, Xinyi Min, Tian-jian Luo, Haoyang Peng, Huosheng Hu, Shen-rui Wu, Xin-jie Lu, Hua Peng
DOI: 10.1016/j.entcom.2025.101016
Journal: Entertainment Computing, Volume 55, Article 101016
Publication date: 2025-08-23
Article URL: https://www.sciencedirect.com/science/article/pii/S1875952125000965
Citations: 0

Abstract

A brain-computer interface (BCI) enables the human brain to control external devices directly, while human-robot interaction (HRI) can develop a robot's autonomous, cognitive, and social abilities. Building on BCI and HRI, this study proposes a brain-controlled robotic choreography approach based on motor imagery (MI), which belongs to the category of cooperative human-robot dance. The whole system comprises two parts: offline training and online calibration. In offline training, a robot-guided MI experimental paradigm was constructed, and electroencephalogram (EEG) samples of MI were collected to train a convolutional neural network (CNN) model. During online calibration, a biped humanoid robot named "Yanshee" served as the carrier of the robotic dance, and a corresponding dance motion library with mapping rules was designed. Based on the trained CNN model, a majority voting strategy was used to keep recognition robust, and each recognized MI command drove the robotic choreography through this library and these rules. Experimental results show an average offline classification accuracy of 74.71% across seven subjects. Three online control strategies were applied to the same seven subjects, achieving an average classification accuracy of 75.40%. To evaluate the brain-controlled robotic choreography, four invited experts gave an overall average score of 7.67 on 21 robotic dance works using a 10-point scale. The constructed framework offers a novel view of integrating science and art, developing a new entertainment application of social robots.
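The abstract's online pipeline, majority voting over per-segment CNN predictions followed by a rule-based mapping to dance motions, can be sketched as below. This is a minimal illustration, not the paper's implementation: the MI class names, the motion labels, and the voting window are all assumptions, since the actual motion library and mapping rules are not given on this page.

```python
from collections import Counter

# Hypothetical mapping from recognized MI classes to dance motions.
# The paper designs its own motion library for the Yanshee robot;
# these entries are placeholders for illustration only.
MOTION_LIBRARY = {
    "left_hand": "raise_left_arm",
    "right_hand": "raise_right_arm",
    "feet": "step_forward",
}

def majority_vote(predictions):
    """Return the most frequent MI label among per-segment CNN outputs."""
    label, _count = Counter(predictions).most_common(1)[0]
    return label

def choreograph(predictions):
    """Map the voted MI command to a motion from the assumed library,
    falling back to a neutral pose for unrecognized commands."""
    command = majority_vote(predictions)
    return MOTION_LIBRARY.get(command, "hold_pose")
```

For example, if three consecutive EEG segments yield the CNN predictions `["left_hand", "left_hand", "right_hand"]`, the vote resolves to `left_hand` and the robot would execute the corresponding motion. Voting over several segments trades latency for robustness against single-segment misclassifications, which matches the abstract's stated motivation.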
Source journal: Entertainment Computing (Computer Science – Human-Computer Interaction)
CiteScore: 5.90
Self-citation rate: 7.10%
Articles per year: 66
Journal description: Entertainment Computing publishes original, peer-reviewed research articles and serves as a forum for stimulating and disseminating innovative research ideas, emerging technologies, empirical investigations, state-of-the-art methods and tools in all aspects of digital entertainment, new media, entertainment computing, gaming, robotics, toys and applications among researchers, engineers, social scientists, artists and practitioners. Theoretical, technical, empirical, survey articles and case studies are all appropriate to the journal.