Development of a brain–machine interface based robot navigation system for disabled people
Abhijeet Ravankar, Ankit A. Ravankar, Arpit Rawankar
Artificial Life and Robotics, 30(3), 398–406 (2025). DOI: 10.1007/s10015-025-01024-2
Abstract
People with serious physical disabilities (e.g., spinal muscular atrophy) find it difficult to control a robot wheelchair. Although gesture-based robot control mechanisms have been proposed, making such gestures is not always feasible. To this end, this paper proposes a brain–machine interface (BMI) for robot control by processing electroencephalograph (EEG) signals captured from a non-invasive external device. We systematically process the EEG signals to first estimate the most prominent brain channels. This eliminates redundant information and noise that adversely influence recognition accuracy. We then estimate the most prominent EEG waves within the prominent channels. Finally, the combination of prominent brain waves among the prominent channels that gives the most accurate robot control is estimated. A convolutional neural network (CNN) is used to process the EEG signals. The user can control the robot in four different directions. Experiments with an actual external BMI device are performed, and the robot is successfully controlled.
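The abstract describes a pipeline in which pre-selected ("prominent") EEG channels and wave bands are fed to a CNN that outputs one of four direction commands. The sketch below is a minimal illustration of that final classification stage, not the authors' implementation: the channel count, window length, layer sizes, and the EEGDirectionCNN class name are all illustrative assumptions.

```python
# Minimal sketch (assumed architecture, not the paper's model): a small 1-D CNN
# that maps a window of pre-selected, band-filtered EEG channels to one of four
# robot motion commands (forward / backward / left / right).
import torch
import torch.nn as nn

N_CHANNELS = 4      # assumed number of "prominent" EEG channels after selection
WINDOW_LEN = 256    # assumed number of samples per classification window
N_CLASSES = 4       # four robot directions

class EEGDirectionCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 16, kernel_size=7, padding=3),
            nn.BatchNorm1d(16),
            nn.ReLU(),
            nn.MaxPool1d(4),                      # 256 -> 64 samples
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(4),                      # 64 -> 16 samples
        )
        self.classifier = nn.Linear(32 * 16, N_CLASSES)

    def forward(self, x):
        # x: (batch, channels, samples) window of band-filtered EEG
        h = self.features(x)
        return self.classifier(h.flatten(1))      # logits over four directions

# Usage example with a placeholder EEG window:
model = EEGDirectionCNN()
window = torch.randn(1, N_CHANNELS, WINDOW_LEN)
direction = model(window).argmax(dim=1).item()    # 0..3 -> robot command
```

In a real system, the predicted class index would be translated into a velocity command for the wheelchair robot; the channel and wave-band selection steps described in the abstract would run upstream of this classifier.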