A System For P300 Detection Applied To Vehicle Navigation
Riley Magee, S. Givigi
2021 IEEE International Systems Conference (SysCon)
Published: 2021-04-15
DOI: 10.1109/SysCon48628.2021.9447129
Citations: 0
Abstract
Brain-machine interface (BMI) systems classify biological signals from the brain, such as electroencephalogram (EEG) data, to determine control commands. Several different signals can be used for the interface; among them is the P300 signal, an event-related potential that is passively produced when a user observes, hears, or attends to a desired stimulus. This signal has been used in conjunction with a graphical user interface (GUI) to allow a person to choose commands from a list of possible actions. Traditionally, the visual stimuli are repeated and the responses averaged to increase classification accuracy, which in turn reduces the maximum possible command rate. To improve the command rate, this paper describes a system in which feature extraction and classifier training can be tested offline. Live testing in a mobile robot steering simulation was then carried out, and finally a live experiment is reported. The features used for classification are selected with a genetic algorithm (GA). Using the chosen features, 78.3% signal detection accuracy was achieved for single epochs. Using multiple epochs to improve classifier performance in simulated and real-world steering experiments, we were able to navigate a simple maze while maintaining classifier accuracy (Sim: $79.9 \pm 5.3$%, Real: $88.8 \pm 10.1$%).
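The abstract notes the core trade-off the paper targets: averaging repeated stimulus presentations raises P300 detection accuracy but lowers the achievable command rate. A minimal sketch of why averaging helps, assuming a simple NumPy array layout (the function name, array shapes, and synthetic signal are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def average_epochs(epochs):
    """Average repeated stimulus epochs to raise the evoked-response SNR.

    epochs: array of shape (n_repetitions, n_samples), one EEG segment per
    repetition of the same stimulus. Averaging N repetitions attenuates
    uncorrelated noise by roughly sqrt(N), but takes N times as long per
    selection, which is what reduces the maximum command rate.
    """
    return np.asarray(epochs).mean(axis=0)

# Toy demonstration: a fixed P300-like template buried in heavy noise.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, np.pi, 100))            # idealized evoked response
trials = template + rng.normal(0.0, 2.0, size=(10, 100))  # 10 noisy repetitions

single_err = np.abs(trials[0] - template).mean()          # single-epoch error
avg_err = np.abs(average_epochs(trials) - template).mean()  # 10-epoch average error
```

In this toy setup `avg_err` comes out well below `single_err`, mirroring the paper's motivation for working with single (or few) epochs only when the classifier is strong enough.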