Brain-Computer Interface for high-level control of rehabilitation robotic systems
D. Valbuena, M. Cyriacks, O. Friman, I. Volosyak, A. Graser
2007 IEEE 10th International Conference on Rehabilitation Robotics
Publication date: 2007-06-13
DOI: 10.1109/ICORR.2007.4428489 (https://doi.org/10.1109/ICORR.2007.4428489)
Citations: 81
Abstract
In this work, a brain-computer interface (BCI) based on steady-state visual evoked potentials (SSVEP) is presented as an input device for the human-machine interface (HMI) of the semi-autonomous robot FRIEND II. The role of the BCI is to translate high-level requests from the user into control commands for the FRIEND II system. In the current application, the BCI is used to navigate a menu system and to select commands such as pouring a beverage into a glass. The low-level control of the test platform, the rehabilitation robot FRIEND II, is executed by the control architecture MASSiVE, which in turn is served by a planning instance, an environment model, and a set of sensors (e.g., machine vision) and actuators. The BCI is introduced as a step towards the goal of providing disabled users with at least 1.5 hours of independence from caregivers.
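The core idea of an SSVEP-based BCI is that each menu item flickers at a distinct frequency; attending to one item evokes an EEG response at that frequency, so the selected command can be recovered by comparing spectral power at the candidate frequencies. The sketch below illustrates this principle only; the flicker frequencies, the command names, and the simple FFT band-power classifier are illustrative assumptions, not the detection method or menu layout used in the FRIEND II system.

```python
import numpy as np

# Hypothetical mapping of stimulus flicker frequencies (Hz) to menu commands;
# the actual frequencies and menu entries of FRIEND II are not specified here.
COMMANDS = {13.0: "pour beverage", 14.0: "move gripper", 15.0: "back", 16.0: "confirm"}

def classify_ssvep(eeg, fs):
    """Return the menu command whose flicker frequency carries the most
    spectral power in the EEG segment (illustrative band-power detector)."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg)) ** 2

    def band_power(f0, bw=0.5):
        # Sum power in a narrow band around the candidate flicker frequency.
        mask = (freqs >= f0 - bw) & (freqs <= f0 + bw)
        return power[mask].sum()

    best = max(COMMANDS, key=band_power)
    return COMMANDS[best]

# Synthetic demo: a 13 Hz evoked response buried in noise, 4 s at 256 Hz.
fs = 256
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 13.0 * t) + 0.5 * rng.standard_normal(t.size)
print(classify_ssvep(eeg, fs))  # → pour beverage
```

In a real system the detected command would then be forwarded as a high-level request to the control architecture (MASSiVE, in the paper's setup), which plans and executes the low-level robot motion.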