Brain-computer interface controlled robotic choreography based on motor imagery EEG
Jing Li, Xinyi Min, Tian-jian Luo, Haoyang Peng, Huosheng Hu, Shen-rui Wu, Xin-jie Lu, Hua Peng
Entertainment Computing, Volume 55, Article 101016, published 2025-08-23
DOI: 10.1016/j.entcom.2025.101016
https://www.sciencedirect.com/science/article/pii/S1875952125000965
Citations: 0
Abstract
A brain-computer interface (BCI) enables the human brain to control external devices directly, while human-robot interaction (HRI) can develop a robot’s autonomous, cognitive, and social abilities. Building on BCI and HRI, this study proposes a brain-controlled robotic choreography approach based on motor imagery, which belongs to the category of cooperative human-robot dance. The system consists of two parts: offline training and online calibration. In offline training, a robot-guided motor imagery (MI) experimental paradigm was first constructed, and electroencephalogram (EEG) samples of MI were collected to train a convolutional neural network (CNN) model. During online calibration, a biped humanoid robot named “Yanshee” served as the carrier of the robotic dance, and a corresponding dance motion library and mapping rules were designed. With the trained CNN model, a majority voting strategy was applied to keep recognition robust, and the recognized MI command drove the robotic choreography through this library and these rules. Experimental results show an average offline classification accuracy of 74.71% across seven subjects. Three online control strategies were applied to the same seven subjects, achieving an average classification accuracy of 75.40%. To evaluate the brain-controlled robotic choreography, four invited experts rated 21 robotic dance works on a 10-point scale, giving an overall average score of 7.67. The constructed framework offers a novel view of integrating science and art and develops a new entertainment application for social robots.
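The abstract outlines an online pipeline in which per-window CNN predictions are combined by majority voting and the winning MI command is mapped, via designed rules, to a motion from the robot's dance library. Below is a minimal sketch of that voting-and-mapping step, not the authors' implementation: the class labels, motion names, and the `predict_window` callable are hypothetical assumptions, since the paper's actual CNN architecture, windowing scheme, and motion library are not specified in the abstract.

```python
# Minimal sketch of majority voting over per-window CNN predictions, mapped to
# a dance motion. All names below are illustrative assumptions, not the paper's API.
from collections import Counter
from typing import Callable, Sequence

import numpy as np

# Hypothetical mapping rules: MI class index -> named motion in the robot's library.
MOTION_LIBRARY = {
    0: "raise_left_arm",   # e.g., left-hand motor imagery
    1: "raise_right_arm",  # e.g., right-hand motor imagery
    2: "step_forward",     # e.g., feet motor imagery
}

def vote_and_map(
    predict_window: Callable[[np.ndarray], int],
    eeg_windows: Sequence[np.ndarray],
) -> str:
    """Classify each EEG window with the trained CNN, take the majority vote,
    and return the dance motion mapped to the winning MI command."""
    votes = [predict_window(w) for w in eeg_windows]
    winning_class, _ = Counter(votes).most_common(1)[0]
    return MOTION_LIBRARY[winning_class]
```

In the actual system the selected motion would presumably be dispatched to the Yanshee robot's motion interface; that interface is not described in the abstract, so it is left out of the sketch.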
Journal introduction:
Entertainment Computing publishes original, peer-reviewed research articles and serves as a forum for stimulating and disseminating innovative research ideas, emerging technologies, empirical investigations, state-of-the-art methods, and tools in all aspects of digital entertainment, new media, entertainment computing, gaming, robotics, toys, and applications among researchers, engineers, social scientists, artists, and practitioners. Theoretical, technical, empirical, and survey articles, as well as case studies, are all appropriate for the journal.