{"title":"基于变压器模型的脑电分类","authors":"Jiayao Sun, J. Xie, Huihui Zhou","doi":"10.1109/LifeTech52111.2021.9391844","DOIUrl":null,"url":null,"abstract":"Transformer has been widely used in the field of natural language processing (NLP) with its superior ability to handle long-range dependencies in comparison with convolutional neural network (CNN) and recurrent neural network (RNN). This correlation is also important for the recognition of time series signals, such as electroencephalogram (EEG). Currently, commonly used EEG classification models are CNN, RNN, deep believe network (DBN), and hybrid CNN. Transformer has not been used in EEG recognition. In this study, we constructed multiple Transformer-based models for motor imaginary (MI) EEG classification, and obtained superior performances in comparison with the previous state-of-art. We found that the activities of the motor cortex had a great contribution to classification in our model through visualization, and positional embedding (PE) method could improve classification accuracy. These results suggest that the attention mechanism of Transformer combined with CNN might be a powerful model for the recognition of sequence data.","PeriodicalId":274908,"journal":{"name":"2021 IEEE 3rd Global Conference on Life Sciences and Technologies (LifeTech)","volume":"21 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"33","resultStr":"{\"title\":\"EEG Classification with Transformer-Based Models\",\"authors\":\"Jiayao Sun, J. Xie, Huihui Zhou\",\"doi\":\"10.1109/LifeTech52111.2021.9391844\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Transformer has been widely used in the field of natural language processing (NLP) with its superior ability to handle long-range dependencies in comparison with convolutional neural network (CNN) and recurrent neural network (RNN). 
This correlation is also important for the recognition of time series signals, such as electroencephalogram (EEG). Currently, commonly used EEG classification models are CNN, RNN, deep believe network (DBN), and hybrid CNN. Transformer has not been used in EEG recognition. In this study, we constructed multiple Transformer-based models for motor imaginary (MI) EEG classification, and obtained superior performances in comparison with the previous state-of-art. We found that the activities of the motor cortex had a great contribution to classification in our model through visualization, and positional embedding (PE) method could improve classification accuracy. These results suggest that the attention mechanism of Transformer combined with CNN might be a powerful model for the recognition of sequence data.\",\"PeriodicalId\":274908,\"journal\":{\"name\":\"2021 IEEE 3rd Global Conference on Life Sciences and Technologies (LifeTech)\",\"volume\":\"21 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-03-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"33\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 IEEE 3rd Global Conference on Life Sciences and Technologies (LifeTech)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/LifeTech52111.2021.9391844\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE 3rd Global Conference on Life Sciences and Technologies 
(LifeTech)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/LifeTech52111.2021.9391844","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
The Transformer has been widely used in natural language processing (NLP) owing to its superior ability to handle long-range dependencies compared with convolutional neural networks (CNNs) and recurrent neural networks (RNNs). Such long-range dependencies are also important for the recognition of time-series signals such as the electroencephalogram (EEG). Currently, the commonly used EEG classification models are CNNs, RNNs, deep belief networks (DBNs), and hybrid CNNs; the Transformer has not yet been applied to EEG recognition. In this study, we constructed multiple Transformer-based models for motor imagery (MI) EEG classification and obtained performance superior to the previous state of the art. Through visualization, we found that activity in the motor cortex contributed strongly to classification in our models, and that the positional embedding (PE) method improved classification accuracy. These results suggest that the attention mechanism of the Transformer combined with a CNN may be a powerful model for the recognition of sequence data.
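The long-range dependency handling the abstract attributes to the Transformer comes from scaled dot-product attention, in which every time step can directly attend to every other step regardless of distance. The sketch below is a minimal, dependency-free illustration of that mechanism on a toy sequence of feature vectors; the vector values are made up for demonstration and are not taken from the paper.

```python
import math

def softmax(xs):
    # Subtract the max before exponentiating for numerical stability.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def scaled_dot_product_attention(query, keys, values):
    """Attend one query vector over a sequence of key/value vectors.

    scores_i = (q . k_i) / sqrt(d)
    weights  = softmax(scores)        # sum to 1 over the sequence
    output   = sum_i weights_i * v_i  # weighted mix of all time steps
    """
    d = len(query)
    scores = [dot(query, k) / math.sqrt(d) for k in keys]
    weights = softmax(scores)
    dim_v = len(values[0])
    out = [sum(w * v[j] for w, v in zip(weights, values)) for j in range(dim_v)]
    return out, weights

# Toy 4-step "EEG feature" sequence with 2-dimensional embeddings
# (hypothetical values; in self-attention, values == keys == the sequence).
keys = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]]
values = keys
query = [1.0, 0.0]

out, weights = scaled_dot_product_attention(query, keys, values)
```

Because the attention weights are computed for all positions at once, a step at the start of a trial can influence the representation of a step at the end in a single layer, which is what distinguishes this mechanism from the step-by-step propagation of an RNN.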