P. Arpaia, C. Bravaccio, G. Corrado, Luigi Duraccio, N. Moccaldi, Silvia Rossi
{"title":"Robotic Autism Rehabilitation by Wearable Brain-Computer Interface and Augmented Reality","authors":"P. Arpaia, C. Bravaccio, G. Corrado, Luigi Duraccio, N. Moccaldi, Silvia Rossi","doi":"10.1109/MeMeA49120.2020.9137144","DOIUrl":null,"url":null,"abstract":"An instrument based on the integration of Brain Computer Interface (BCI) and Augmented Reality (AR) is proposed for robotic autism rehabilitation. Flickering stimuli at fixed frequencies appear on the display of Augmented Reality (AR) glasses. When the user focuses on one of the stimuli a Steady State Visual Evoked Potentials (SSVEP) occurs on his occipital region. A single-channel electroencephalographic Brain Computer Interface detects the elicited SSVEP and sends the corresponding commands to a mobile robot. The device’s high wearability (single channel and dry electrodes), and the trainingless usability are fundamental for the acceptance by Autism Spectrum Disorder (ASD) children. Effectively controlling the movements of a robot through a new channel enhances rehabilitation engagement and effectiveness. A case study at an accredited rehabilitation center on 10 healthy adult subjects highlighted an average accuracy higher than 83%. Preliminary further tests at the Department of Translational Medical Sciences of University of Naples Federico II on 3 ASD patients between 8 and 10 years old provided positive feedback on device acceptance and attentional performance.","PeriodicalId":152478,"journal":{"name":"2020 IEEE International Symposium on Medical Measurements and Applications (MeMeA)","volume":"35 8","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE International Symposium on Medical Measurements and Applications (MeMeA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MeMeA49120.2020.9137144","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9
Abstract
An instrument based on the integration of a Brain-Computer Interface (BCI) and Augmented Reality (AR) is proposed for robotic autism rehabilitation. Flickering stimuli at fixed frequencies appear on the display of AR glasses. When the user focuses on one of the stimuli, a Steady-State Visual Evoked Potential (SSVEP) arises over the occipital region. A single-channel electroencephalographic BCI detects the elicited SSVEP and sends the corresponding command to a mobile robot. The device's high wearability (single channel and dry electrodes) and its training-free usability are fundamental for acceptance by children with Autism Spectrum Disorder (ASD). Effectively controlling the movements of a robot through a new channel enhances rehabilitation engagement and effectiveness. A case study at an accredited rehabilitation center on 10 healthy adult subjects showed an average accuracy higher than 83%. Preliminary further tests at the Department of Translational Medical Sciences of the University of Naples Federico II on 3 ASD patients between 8 and 10 years old provided positive feedback on device acceptance and attentional performance.
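To make the SSVEP-to-command step concrete, the sketch below shows one common way a single-channel system of this kind can classify which flickering stimulus the user is attending to: compare spectral power at each stimulation frequency and map the winner to a robot command. This is a minimal illustration, not the paper's actual pipeline; the stimulation frequencies, sampling rate, window length, and the power-spectral-density detector are all assumptions, since the abstract does not specify the classifier.

```python
import numpy as np
from scipy.signal import welch

# Hypothetical stimulation frequencies (Hz) of the flickering AR stimuli
# and an assumed EEG sampling rate; neither is given in the abstract.
STIM_FREQS = [8.0, 10.0, 12.0, 15.0]
FS = 256

def detect_ssvep_command(eeg_window, fs=FS, stim_freqs=STIM_FREQS, band=0.5):
    """Return the index of the stimulation frequency with the highest
    spectral power in a single-channel EEG window (occipital electrode).
    The index would then be mapped to a robot command."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=min(len(eeg_window), 2 * fs))
    scores = []
    for f in stim_freqs:
        mask = (freqs >= f - band) & (freqs <= f + band)
        scores.append(psd[mask].mean())
    return int(np.argmax(scores))

# Usage example: 4 s of synthetic EEG dominated by a 10 Hz SSVEP plus noise.
t = np.arange(0, 4, 1 / FS)
eeg = 2.0 * np.sin(2 * np.pi * 10.0 * t) + np.random.randn(t.size)
print(detect_ssvep_command(eeg))  # expected: 1, i.e. the 10 Hz stimulus
```

A detector of this form needs no per-user calibration, which is consistent with the training-free usability the abstract emphasizes; more elaborate SSVEP classifiers (e.g., canonical correlation analysis) typically require multiple channels or reference templates.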