{"title":"基于小波增强相位相干特征的脑电想象语音分类","authors":"Anand Mohan;R. S Anand","doi":"10.1109/LSENS.2025.3591964","DOIUrl":null,"url":null,"abstract":"Brain–computer interfaces (BCIs) provide direct communication between the brain and external devices. Using electroencephalogram (EEG) sensors, BCIs are applied in assistive technologies and neuroprosthetics. Among various BCI paradigms, imagined speech-based BCI aims to decode internal speech representations from EEG signals, enabling silent communication. Decoding imagined speech is challenging due to the nonstationarity, intersubject variability of EEG signals and low signal-to-noise ratio. The proposed method uses a multilayer perceptron (MLP) integrated with a convolutional block attention module (CBAM) to enhance feature learning by refining spatial and channel-wise attention. To further improve performance, wavelet-based augmentation enhances data diversity. Phase and coherence-based functional connectivity features capture interchannel dependencies critical for imagined speech classification. The proposed wavelet-augmented phase coherence features with MLP-CBAM (WaveCoh-MLP-CBAM) framework is evaluated on an imagined speech dataset. The WaveCoh-MLP-CBAM shows superior classification accuracy, F1-score, and Cohen's kappa compared to conventional approaches. Results highlight the importance of augmentation, functional connectivity features, and attention in improving EEG-based imagined speech decoding.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 8","pages":"1-4"},"PeriodicalIF":2.2000,"publicationDate":"2025-07-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Wavelet Augmented Phase Coherence Features for EEG-Based Imagined Speech Classification\",\"authors\":\"Anand Mohan;R. S Anand\",\"doi\":\"10.1109/LSENS.2025.3591964\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Brain–computer interfaces (BCIs) provide direct communication between the brain and external devices. Using electroencephalogram (EEG) sensors, BCIs are applied in assistive technologies and neuroprosthetics. Among various BCI paradigms, imagined speech-based BCI aims to decode internal speech representations from EEG signals, enabling silent communication. Decoding imagined speech is challenging due to the nonstationarity, intersubject variability of EEG signals and low signal-to-noise ratio. The proposed method uses a multilayer perceptron (MLP) integrated with a convolutional block attention module (CBAM) to enhance feature learning by refining spatial and channel-wise attention. To further improve performance, wavelet-based augmentation enhances data diversity. Phase and coherence-based functional connectivity features capture interchannel dependencies critical for imagined speech classification. The proposed wavelet-augmented phase coherence features with MLP-CBAM (WaveCoh-MLP-CBAM) framework is evaluated on an imagined speech dataset. The WaveCoh-MLP-CBAM shows superior classification accuracy, F1-score, and Cohen's kappa compared to conventional approaches. 
Results highlight the importance of augmentation, functional connectivity features, and attention in improving EEG-based imagined speech decoding.\",\"PeriodicalId\":13014,\"journal\":{\"name\":\"IEEE Sensors Letters\",\"volume\":\"9 8\",\"pages\":\"1-4\"},\"PeriodicalIF\":2.2000,\"publicationDate\":\"2025-07-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Sensors Letters\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/11091402/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Letters","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/11091402/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Wavelet Augmented Phase Coherence Features for EEG-Based Imagined Speech Classification
Brain–computer interfaces (BCIs) provide direct communication between the brain and external devices. Using electroencephalogram (EEG) sensors, BCIs are applied in assistive technologies and neuroprosthetics. Among various BCI paradigms, imagined speech-based BCI aims to decode internal speech representations from EEG signals, enabling silent communication. Decoding imagined speech is challenging due to the nonstationarity and intersubject variability of EEG signals and their low signal-to-noise ratio. The proposed method uses a multilayer perceptron (MLP) integrated with a convolutional block attention module (CBAM) to enhance feature learning by refining spatial and channel-wise attention. To further improve performance, wavelet-based augmentation is applied to increase data diversity. Phase- and coherence-based functional connectivity features capture the interchannel dependencies critical for imagined speech classification. The proposed framework, which combines wavelet-augmented phase coherence features with the MLP-CBAM classifier (WaveCoh-MLP-CBAM), is evaluated on an imagined speech dataset. WaveCoh-MLP-CBAM achieves superior classification accuracy, F1-score, and Cohen's kappa compared with conventional approaches. Results highlight the importance of augmentation, functional connectivity features, and attention in improving EEG-based imagined speech decoding.
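As a rough illustration of the kind of feature pipeline the abstract describes, the minimal sketch below computes pairwise phase-locking value (PLV) connectivity features and a simple wavelet-domain augmentation using NumPy, SciPy, and PyWavelets. The abstract does not specify the exact connectivity measure, wavelet, or augmentation scheme, so the PLV formulation, the db4 wavelet, and the coefficient-jitter augmentation here are illustrative assumptions rather than the authors' implementation.

```python
# Illustrative sketch only: the paper's exact augmentation and connectivity
# formulas are not given in the abstract; PLV and coefficient jitter are assumed.
import numpy as np
import pywt                       # PyWavelets, for the wavelet-domain augmentation
from scipy.signal import hilbert  # analytic signal -> instantaneous phase


def wavelet_augment(trial, wavelet="db4", level=4, jitter=0.05, rng=None):
    """Create a synthetic trial by perturbing wavelet detail coefficients per channel.

    trial: array of shape (n_channels, n_samples).
    """
    rng = np.random.default_rng() if rng is None else rng
    augmented = np.empty_like(trial)
    for ch in range(trial.shape[0]):
        coeffs = pywt.wavedec(trial[ch], wavelet, level=level)
        # Scale each detail band by a random factor near 1; keep the approximation intact.
        coeffs = [coeffs[0]] + [c * rng.normal(1.0, jitter, size=c.shape) for c in coeffs[1:]]
        rec = pywt.waverec(coeffs, wavelet)
        augmented[ch] = rec[: trial.shape[1]]  # waverec may pad by one sample
    return augmented


def plv_matrix(trial):
    """Phase-locking value between every channel pair (one phase-based connectivity choice)."""
    phase = np.angle(hilbert(trial, axis=1))  # (n_channels, n_samples) instantaneous phase
    n_ch = trial.shape[0]
    plv = np.ones((n_ch, n_ch))
    for i in range(n_ch):
        for j in range(i + 1, n_ch):
            plv[i, j] = plv[j, i] = np.abs(np.mean(np.exp(1j * (phase[i] - phase[j]))))
    return plv


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    eeg_trial = rng.standard_normal((8, 512))                   # dummy 8-channel, 512-sample trial
    synthetic = wavelet_augment(eeg_trial, rng=rng)              # augmented copy for training diversity
    features = plv_matrix(synthetic)[np.triu_indices(8, k=1)]    # upper-triangle connectivity vector
    print(features.shape)                                        # (28,) pairwise features
```

In a setup like this, the upper-triangle connectivity vector from each real or augmented trial would serve as input to the MLP-CBAM classifier; the attention module and network hyperparameters are described in the letter itself.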