CET-attention mechanism impact on the classification of EEG signals

IF 2.2 · CAS Tier 4 (Computer Science) · JCR Q3 (Telecommunications)
Mouad Riyad, Abdellah Adib
Annals of Telecommunications, vol. 80, pp. 547-555. Published 2025-02-17. DOI: 10.1007/s12243-025-01071-7 (https://link.springer.com/article/10.1007/s12243-025-01071-7)
Citations: 0

Abstract

The attention mechanism enables more efficient data processing by driving neural networks to focus on pertinent information. The resulting performance gains have pushed its wide adoption, including for bio-signals. Multiple researchers have explored its use on electroencephalography (EEG) in many scenarios, including motor imagery. Despite the myriad of implementations, their achievement varies from one subject to another since the signals are delicate. In this paper, we extend our previous research (Riyad and Adib 2024) by suggesting a new implementation. The proposal employs the Convolutional Block Attention Module as a backbone, with a few modifications adjusted to the nature of EEG. It uses three levels of attention, performed individually on the channel, time, and electrode dimensions, known as the Channel Attention Module (CAM), Time Attention Module (TAM), and Electrode Attention Module (EAM). This compartmentalization allows the attention sub-blocks to be placed in diverse configurations, each with a specific order that impacts feature extraction. We also study them within two structures: one with early spatial filtering that uses the new block once, and one with late spatial filtering that uses the attention twice. For the experiments, we test on dataset 2b of BCI Competition IV. The results show that placing the CAM first and feeding its output to the TAM and EAM boosts performance drastically. For optimal results, it is necessary to use the new attention once at the beginning of the network. It also permits a more even classification across classes compared with the other configurations.
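To make the three-level design concrete, here is a minimal NumPy sketch of CBAM-style attention applied separately to the channel, time, and electrode axes of an EEG feature map, composed in the CAM-first order the abstract reports as best. This is an illustrative assumption, not the paper's exact implementation: the MLP sizes, the single-weight gates in TAM/EAM, and the tensor layout `(channels, electrodes, time)` are all hypothetical choices made for the sketch.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    # x: (C, E, T) feature map. CBAM recipe: average-pool and max-pool the
    # non-channel axes, pass both descriptors through a shared MLP, sum, gate.
    avg, mx = x.mean(axis=(1, 2)), x.max(axis=(1, 2))            # each (C,)
    gate = sigmoid(w2 @ np.maximum(0, w1 @ avg) + w2 @ np.maximum(0, w1 @ mx))
    return x * gate[:, None, None]

def time_attention(x):
    # Per-time-step gate from descriptors pooled over channels and electrodes
    # (a single-layer stand-in for the paper's TAM).
    avg, mx = x.mean(axis=(0, 1)), x.max(axis=(0, 1))            # each (T,)
    return x * sigmoid(avg + mx)[None, None, :]

def electrode_attention(x):
    # Per-electrode gate from descriptors pooled over channels and time (EAM).
    avg, mx = x.mean(axis=(0, 2)), x.max(axis=(0, 2))            # each (E,)
    return x * sigmoid(avg + mx)[None, :, None]

def cam_first_block(x, w1, w2):
    # The configuration reported as best: CAM output feeds TAM, then EAM.
    return electrode_attention(time_attention(channel_attention(x, w1, w2)))
```

Because every gate is a sigmoid in (0, 1), the block only rescales features, so the output keeps the input's shape; reordering the three calls yields the other configurations the paper compares.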

Source journal: Annals of Telecommunications (Engineering & Technology - Telecommunications)
CiteScore: 5.20 · Self-citation rate: 5.30% · Articles per year: 37 · Review time: 4.5 months
About the journal: Annals of Telecommunications is an international journal publishing original peer-reviewed papers in the field of telecommunications. It covers all the essential branches of modern telecommunications, ranging from digital communications to communication networks and the Internet, to software, protocols and services, uses and economics. This large spectrum of topics accounts for the rapid convergence, through telecommunications, of the underlying technologies in computers, communications, and content management towards the emergence of the information and knowledge society. As a consequence, the Journal provides a medium for exchanging research results and technological achievements accomplished by the European and international scientific community, from academia and industry.