{"title":"Classification of Motor Imagery Tasks Using EEG Based on Wavelet Scattering Transform and Convolutional Neural Network","authors":"Rantu Buragohain;Jejariya Ajaybhai;Karan Nathwani;Vinayak Abrol","doi":"10.1109/LSENS.2024.3488356","DOIUrl":null,"url":null,"abstract":"Electroencephalogram (EEG) signal classification is of utmost importance in brain-computer interface (BCI) systems. However, the inherent complex properties of EEG signals pose a challenge in their analysis and modeling. This letter proposes a novel approach of integrating wavelet scattering transform (WST) with convolutional neural network (CNN) for classifying motor imagery (MI) via EEG signals (referred as WST-CNN), capable of extracting distinctive characteristics in signals even when the data is limited. In this architecture, the first layer is nontrainable WST features with fixed initializations in WST-CNN. Furthermore, WSTs are robust to local perturbations in data, especially in the form of translation invariance, and resilient to deformations, thereby enhancing the network's reliability. The performance of the proposed idea is evaluated on the DBCIE dataset for three different scenarios: left-arm (LA) movement, right-arm (RA) movement, and simultaneous movement of both arms (BA). The BCI Competition IV-2a dataset was also employed to validate the proposed concept across four distinct MI tasks, such as movements in: left-hand (LH), right-hand (RH), feet (FT), and tongue (T). The classifications' performance was evaluated in terms of accuracy (\n<inline-formula><tex-math>$\\eta$</tex-math></inline-formula>\n), sensitivity (\n<inline-formula><tex-math>$S_{e}$</tex-math></inline-formula>\n), specificity (\n<inline-formula><tex-math>$S_{p}$</tex-math></inline-formula>\n), and weighted F1-score, which reached up to 92.72%, 92.72%, 97.57%, and 92.75% for classifying LH, RH, FT, and T on the BCI Competition IV-2a dataset and 89.19%, 89.19%, 94.60%, and 89.33% for classifying LA, RA, and BA, on the DBCIE dataset, respectively.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"8 12","pages":"1-4"},"PeriodicalIF":2.2000,"publicationDate":"2024-11-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Letters","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10748355/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Abstract
Electroencephalogram (EEG) signal classification is of utmost importance in brain-computer interface (BCI) systems. However, the inherently complex properties of EEG signals make their analysis and modeling challenging. This letter proposes a novel approach that integrates the wavelet scattering transform (WST) with a convolutional neural network (CNN), referred to as WST-CNN, for classifying motor imagery (MI) from EEG signals; it is capable of extracting distinctive signal characteristics even when data are limited. In this architecture, the first layer of the WST-CNN consists of nontrainable WST features with fixed initializations. Furthermore, WSTs are robust to local perturbations in the data, particularly through translation invariance, and are resilient to deformations, thereby enhancing the network's reliability. The performance of the proposed approach is evaluated on the DBCIE dataset for three scenarios: left-arm (LA) movement, right-arm (RA) movement, and simultaneous movement of both arms (BA). The BCI Competition IV-2a dataset was also employed to validate the proposed concept across four distinct MI tasks: left-hand (LH), right-hand (RH), feet (FT), and tongue (T) movements. Classification performance was evaluated in terms of accuracy ($\eta$), sensitivity ($S_{e}$), specificity ($S_{p}$), and weighted F1-score, which reached up to 92.72%, 92.72%, 97.57%, and 92.75%, respectively, for classifying LH, RH, FT, and T on the BCI Competition IV-2a dataset, and 89.19%, 89.19%, 94.60%, and 89.33%, respectively, for classifying LA, RA, and BA on the DBCIE dataset.
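The abstract describes a pipeline in which a fixed, nontrainable WST layer feeds a trainable CNN head. Below is a minimal sketch of such an arrangement in PyTorch, assuming the kymatio library's Scattering1D for the scattering front end. The EEG channel count, trial length, scattering settings (J, Q), and the CNN head are illustrative assumptions, not the architecture reported in the letter.

```python
# Minimal WST-CNN sketch: fixed wavelet scattering front end + small 1-D CNN.
# Assumes kymatio (pip install kymatio) and PyTorch. All hyperparameters are
# illustrative placeholders, not the values used by the authors.
import torch
import torch.nn as nn
from kymatio.torch import Scattering1D

class WSTCNN(nn.Module):
    def __init__(self, n_eeg_channels=22, n_samples=1000, n_classes=4):
        super().__init__()
        # Nontrainable first layer: the scattering filters are fixed at
        # construction time, so only the CNN head below is learned.
        self.scattering = Scattering1D(J=6, shape=n_samples, Q=8)
        # Probe the scattering output shape with a dummy forward pass.
        with torch.no_grad():
            s = self.scattering(torch.zeros(1, n_eeg_channels, n_samples))
        in_ch = s.shape[1] * s.shape[2]  # EEG channels x scattering paths
        self.cnn = nn.Sequential(
            nn.Conv1d(in_ch, 64, kernel_size=3, padding=1),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):                 # x: (batch, channels, samples)
        s = self.scattering(x)            # (batch, channels, paths, time)
        s = torch.log1p(s.flatten(1, 2))  # merge channel/path axes; compress dynamics
        return self.cnn(s)

# Example: a batch of 8 four-class MI trials (22 channels, 1000 samples each).
model = WSTCNN()
logits = model(torch.randn(8, 22, 1000))  # -> (8, 4) class scores
```

Because the scattering layer has no trainable parameters, only the CNN head is optimized during training, which is one reason such hybrids can remain stable when training data are scarce, as the abstract emphasizes.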