Gestural art: A Steady State Visual Evoked Potential (SSVEP) based Brain Computer Interface to express intentions through a robotic hand

R. Meattini, U. Scarcia, C. Melchiorri, Tony Belpaeme

The 23rd IEEE International Symposium on Robot and Human Interactive Communication, 2014-10-20
DOI: 10.1109/ROMAN.2014.6926255
Citations: 6
Abstract
We present an automated solution for the acquisition, processing and classification of electroencephalography (EEG) signals, used to remotely control a robotic hand executing communicative gestures. The Brain-Computer Interface (BCI) was implemented using the Steady State Visual Evoked Potential (SSVEP) approach, a low-latency and low-noise method for reading multiple non-time-locked states from EEG signals. As the EEG sensor, the low-cost commercial Emotiv EPOC headset was used to acquire signals from the parietal and occipital lobes. The data processing chain was implemented in OpenViBE, a dedicated software platform for designing, testing and applying Brain-Computer Interfaces. Recorded commands were communicated to an external server through a Virtual Reality Peripheral Network (VRPN) interface. During the training phase, the user controlled a local simulation of a dexterous robot hand, providing a safe environment in which to train. After training, the user's commands were used to remotely control a real dexterous robot hand located in Bologna (Italy) from Plymouth (UK). We report on the robustness, accuracy and latency of the setup.
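SSVEP-based BCIs generally classify the user's intent by finding which flickering stimulus frequency dominates the EEG spectrum over the occipital/parietal channels. The sketch below is not the authors' OpenViBE pipeline; it is a minimal, generic illustration of the frequency-detection idea using NumPy, with a synthetic single-channel epoch at the EPOC's 128 Hz sampling rate. The function name `detect_ssvep` and the candidate frequencies are illustrative assumptions.

```python
import numpy as np

def detect_ssvep(signal, fs, candidate_freqs, harmonics=2):
    """Pick the stimulation frequency whose FFT power (including the
    first few harmonics) is largest in the given EEG epoch.

    This is a simplified stand-in for the spectral classifiers commonly
    used in SSVEP BCIs, not the paper's actual processing chain.
    """
    spectrum = np.abs(np.fft.rfft(signal)) ** 2          # power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)     # bin frequencies (Hz)
    scores = []
    for f0 in candidate_freqs:
        # Sum power at the fundamental and its harmonics.
        score = sum(spectrum[np.argmin(np.abs(freqs - h * f0))]
                    for h in range(1, harmonics + 1))
        scores.append(score)
    return candidate_freqs[int(np.argmax(scores))]

# Synthetic 4-second epoch: a 12 Hz SSVEP response buried in noise.
fs = 128                                   # Emotiv EPOC sampling rate (Hz)
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
epoch = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)

print(detect_ssvep(epoch, fs, [8.0, 10.0, 12.0, 15.0]))  # → 12.0
```

In a real setup, each candidate frequency is bound to one command (e.g. one gesture), so the detected frequency maps directly to the gesture request sent over the network to the robot hand.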