{"title":"Harnessing the Multi-Phasal Nature of Speech-EEG for Enhancing Imagined Speech Recognition","authors":"Rini Sharon;Mriganka Sur;Hema Murthy","doi":"10.1109/OJSP.2025.3528368","DOIUrl":null,"url":null,"abstract":"Analyzing speech-electroencephalogram (EEG) is pivotal for developing non-invasive and naturalistic brain-computer interfaces. Recognizing that the nature of human communication involves multiple phases like audition, imagination, articulation, and production, this study uncovers the shared cognitive imprints that represent speech cognition across these phases. Regression analysis, using correlation metrics reveal pronounced inter-phasal congruence. This insight promotes a shift from single-phase-centric recognition models to harnessing integrated phase data, thereby enhancing recognition of cognitive speech. Having established the presence of inter-phase associations, a common representation learning feature extractor is introduced, adept at capturing the correlations and replicability across phases. The features so extracted are observed to provide superior discrimination of cognitive speech units. Notably, the proposed approach proves resilient even in the absence of comprehensive multi-phasal data. Through thorough control checks and illustrative topographical visualizations, our observations are substantiated. The findings indicate that the proposed multi-phase approach significantly enhances EEG-based speech recognition, achieving an accuracy gain of 18.2% for 25 cognitive units in continuous speech EEG over models reliant solely on single-phase data.","PeriodicalId":73300,"journal":{"name":"IEEE open journal of signal processing","volume":"6 ","pages":"78-88"},"PeriodicalIF":2.9000,"publicationDate":"2025-01-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10839023","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE open journal of signal processing","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10839023/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Abstract
Analyzing speech-electroencephalogram (EEG) signals is pivotal for developing non-invasive and naturalistic brain-computer interfaces. Recognizing that human communication involves multiple phases, such as audition, imagination, articulation, and production, this study uncovers the shared cognitive imprints that represent speech cognition across these phases. Regression analysis using correlation metrics reveals pronounced inter-phasal congruence. This insight motivates a shift from single-phase-centric recognition models to harnessing integrated multi-phase data, thereby enhancing recognition of cognitive speech. Having established the presence of inter-phase associations, we introduce a common representation learning feature extractor adept at capturing the correlations and replicability across phases. The extracted features are observed to provide superior discrimination of cognitive speech units. Notably, the proposed approach remains resilient even in the absence of comprehensive multi-phasal data. Our observations are substantiated through thorough control checks and illustrative topographical visualizations. The findings indicate that the proposed multi-phase approach significantly enhances EEG-based speech recognition, achieving an accuracy gain of 18.2% for 25 cognitive units in continuous speech EEG over models that rely solely on single-phase data.
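To illustrate the kind of inter-phase correlation analysis the abstract refers to, the minimal sketch below computes channel-wise correlations between EEG feature matrices taken from two phases of the same utterance (e.g., imagination vs. articulation). The array shapes, the random placeholder data, and the plain Pearson-correlation measure are assumptions for illustration only; they are not the authors' data, preprocessing, or regression pipeline.

```python
import numpy as np

# Hypothetical EEG feature matrices for two phases of the same utterance,
# shaped (n_channels, n_time_frames). Random placeholders, not real recordings.
rng = np.random.default_rng(0)
imagined = rng.standard_normal((64, 200))     # e.g., imagination phase
articulated = rng.standard_normal((64, 200))  # e.g., articulation phase

# Pearson correlation per channel between the two phases; the mean gives a
# crude scalar index of inter-phase congruence.
channel_corrs = [np.corrcoef(imagined[ch], articulated[ch])[0, 1]
                 for ch in range(imagined.shape[0])]
print(f"mean inter-phase correlation: {np.mean(channel_corrs):.3f}")
```

With random placeholder data the mean correlation hovers near zero; the paper's claim is that real speech-EEG phases show markedly stronger congruence than such a chance baseline.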