Towards a versatile brain-machine interface: Neural decoding of multiple behavioral variables and delivering sensory feedback
{"title":"Towards a versatile brain-machine interface: Neural decoding of multiple behavioral variables and delivering sensory feedback versatile brain-machine interface","authors":"M. Lebedev","doi":"10.1109/IWW-BCI.2018.8311500","DOIUrl":null,"url":null,"abstract":"While brain-machine interfaces (BMIs) strive to provide neural prosthetic solutions to people with sensory, motor and cognitive disabilities, they have been typically tested in strictly controlled laboratory settings. Making BMIs versatile and applicable to real life situations is a significant challenge. For example, in real life we can flexibly and independently control multiple behavioral variables, such as programming motor goals, orienting attention in space, fixating objects with the eyes, and remembering relevant information. Several neurophysiological experiments, conducted in monkeys, manipulated multiple behavioral variables in a controlled way; these multiple variables were decoded from the activity of same neuronal ensembles. Additionally, in the other monkey experiments, multiple motor variables were extracted from cortical ensembles in real time, such as controlling two virtual arms using a BMI. The next improvement has been achieved using brain-machine-brain interfaces (BMBIs) that simultaneously extract motor intentions from brain activity and generate artificial sensations using intracortical microstimulation (ICMS). For example, a BMBI can perform active tactile exploration of virtual objects. Such versatile BMIs bring us closer to the development of clinical neural prostheses for restoration and rehabilitation of neural function.","PeriodicalId":6537,"journal":{"name":"2018 6th International Conference on Brain-Computer Interface (BCI)","volume":"73 1","pages":"1-2"},"PeriodicalIF":0.0000,"publicationDate":"2018-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 6th International Conference on Brain-Computer Interface (BCI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IWW-BCI.2018.8311500","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
While brain-machine interfaces (BMIs) strive to provide neural prosthetic solutions to people with sensory, motor, and cognitive disabilities, they have typically been tested in strictly controlled laboratory settings. Making BMIs versatile and applicable to real-life situations is a significant challenge. For example, in everyday behavior we flexibly and independently control multiple behavioral variables, such as programming motor goals, orienting attention in space, fixating objects with the eyes, and remembering relevant information. Several neurophysiological experiments in monkeys manipulated multiple behavioral variables in a controlled way, and these variables were decoded from the activity of the same neuronal ensembles. In other monkey experiments, multiple motor variables were extracted from cortical ensembles in real time, for example to control two virtual arms through a BMI. A further advance has been achieved with brain-machine-brain interfaces (BMBIs), which simultaneously extract motor intentions from brain activity and deliver artificial sensations through intracortical microstimulation (ICMS); such a BMBI can support active tactile exploration of virtual objects. These versatile BMIs bring us closer to clinical neural prostheses for the restoration and rehabilitation of neural function.
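To illustrate the kind of decoding the abstract refers to, the sketch below shows a minimal multi-output linear decoder that maps binned ensemble spike counts, with a short history of lags, onto several behavioral variables at once. This is not the authors' pipeline: the data are synthetic placeholders, and the ridge-regularized linear readout is only one common choice for this type of problem.

```python
# Minimal sketch (assumed setup, not the paper's method): decode K behavioral
# variables simultaneously from binned spike counts of a neuronal ensemble,
# using a lag-embedded, ridge-regularized linear readout on synthetic data.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "recording": T time bins, N neurons, K behavioral variables
# (e.g., hand x/y velocity plus a gaze- or attention-related signal), LAGS bins of history.
T, N, K, LAGS = 2000, 64, 3, 10
true_W = rng.normal(size=(N * LAGS, K))
spikes = rng.poisson(2.0, size=(T, N)).astype(float)

def lag_embed(X, lags):
    """Stack the current bin and the previous lags-1 bins into one feature row."""
    T, N = X.shape
    out = np.zeros((T, N * lags))
    for l in range(lags):
        out[l:, l * N:(l + 1) * N] = X[:T - l]
    return out

X = lag_embed(spikes, LAGS)
# Behavior is generated from the embedded spikes plus noise, so in this toy
# setting the decoder has a real mapping to recover.
Y = X @ true_W + rng.normal(scale=5.0, size=(T, K))

# Train/test split and a single ridge regression shared across all K outputs.
split = int(0.8 * T)
Xtr, Xte, Ytr, Yte = X[:split], X[split:], Y[:split], Y[split:]
lam = 10.0
W = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(Xtr.shape[1]), Xtr.T @ Ytr)

# Evaluate each decoded variable separately on held-out bins.
pred = Xte @ W
for k in range(K):
    r = np.corrcoef(pred[:, k], Yte[:, k])[0, 1]
    print(f"variable {k}: test correlation r = {r:.2f}")
```

The same structure extends naturally to the bimanual case mentioned in the abstract: controlling two virtual arms simply means increasing K so that the readout covers both limbs' kinematic variables, while the ensemble features stay shared.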