GSAformer: Group sparse attention transformer for functional brain network analysis

IF 6.3 | CAS Tier 1, Computer Science | JCR Q1, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Lina Zhou, Xiao Jiang, Mengxue Pang, Jinshan Zhang, Shuai Zhang, Chris Nugent, Lishan Qiao
{"title":"GSAformer: Group sparse attention transformer for functional brain network analysis","authors":"Lina Zhou ,&nbsp;Xiao Jiang ,&nbsp;Mengxue Pang ,&nbsp;Jinshan Zhang ,&nbsp;Shuai Zhang ,&nbsp;Chris Nugent ,&nbsp;Lishan Qiao","doi":"10.1016/j.neunet.2025.107891","DOIUrl":null,"url":null,"abstract":"<div><div>Functional brain network (FBN) analysis based on fMRI has proven effective for neurological/mental disorder classification. Traditional methods usually separate the FBN construction from the subsequent classification tasks, resulting in a suboptimal solution. Recently, transformers, known for their attention mechanisms, have shown strong performance in various tasks, including brain disorder classification. However, existing methods treat subjects independently, limiting the capture of their shared patterns. To address these issues, we propose GSAformer, a group sparse attention-based model for brain disorder diagnosis. Specifically, we first construct brain connectivity matrices for subjects using Pearson’s correlation, and then incorporate group sparse prior into the transformer to explicitly model inter-subject relationships. Group sparsity is applied across attention matrices to reduce parameters, improve the generalization, and enhance the interpretability. A maximum mean discrepancy (MMD) constraint is also introduced to ensure consistency between the learned attention matrices and the group sparse brain networks. Our framework integrates population-level prior knowledge, and supports end-to-end adaptive learning, while maintaining computational complexity on par with the standard Transformer and demonstrating enhanced capability in capturing group sparse topological structures among population. We evaluate the GSAformer on three public datasets for brain disorder classification. The classification performance of the proposed method is improved by 3.8%, 4.1% and 14.7% on the three datasets, respectively, compared with the standard Transformer.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"192 ","pages":"Article 107891"},"PeriodicalIF":6.3000,"publicationDate":"2025-07-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025007725","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Functional brain network (FBN) analysis based on fMRI has proven effective for classifying neurological and mental disorders. Traditional methods usually separate FBN construction from the subsequent classification task, resulting in suboptimal solutions. Recently, transformers, known for their attention mechanisms, have shown strong performance in various tasks, including brain disorder classification. However, existing methods treat subjects independently, limiting their ability to capture patterns shared across subjects. To address these issues, we propose GSAformer, a group sparse attention-based model for brain disorder diagnosis. Specifically, we first construct brain connectivity matrices for subjects using Pearson's correlation, and then incorporate a group sparse prior into the transformer to explicitly model inter-subject relationships. Group sparsity is applied across attention matrices to reduce parameters, improve generalization, and enhance interpretability. A maximum mean discrepancy (MMD) constraint is also introduced to ensure consistency between the learned attention matrices and the group sparse brain networks. Our framework integrates population-level prior knowledge and supports end-to-end adaptive learning, while keeping computational complexity on par with the standard Transformer and better capturing group sparse topological structures across the population. We evaluate GSAformer on three public datasets for brain disorder classification. Compared with the standard Transformer, classification performance improves by 3.8%, 4.1%, and 14.7% on the three datasets, respectively.
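The abstract names two concrete ingredients: Pearson correlation for constructing the connectivity matrices, and a group sparse penalty on the attention matrices paired with an MMD constraint. As a rough illustration of the first step, below is a minimal sketch of FBN construction from ROI time series; the atlas size (90 ROIs) and the zeroed diagonal are illustrative assumptions, not details taken from the paper.

import numpy as np

def build_fbn(bold_signals: np.ndarray) -> np.ndarray:
    # bold_signals: (n_rois, n_timepoints), one row of BOLD signal per ROI.
    # np.corrcoef computes pairwise Pearson correlations between rows.
    fbn = np.corrcoef(bold_signals)
    np.fill_diagonal(fbn, 0.0)  # drop self-connections (a common convention)
    return fbn

# Example: 90 ROIs with 200 time points of synthetic data.
rng = np.random.default_rng(0)
print(build_fbn(rng.standard_normal((90, 200))).shape)  # (90, 90)

For the second ingredient, the following PyTorch sketch shows one plausible reading of the two regularizers: an L2,1-style penalty that groups each connection across subjects (so the population shares a sparse topology), and a Gaussian-kernel MMD between the learned attention matrices and the group sparse brain networks. The function names, the grouping over the subject dimension, the kernel choice, and the unweighted sum of the two terms are all assumptions for illustration; the paper's exact formulation may differ.

import torch

def group_sparse_penalty(attn: torch.Tensor) -> torch.Tensor:
    # attn: (n_subjects, n_rois, n_rois). Each connection (i, j) across all
    # subjects forms one group: L2 norm over subjects, then L1 (sum) over
    # connections, so the same connections tend to be active population-wide.
    return attn.norm(p=2, dim=0).sum()

def mmd_loss(x: torch.Tensor, y: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    # Biased MMD^2 estimate with a Gaussian kernel; x, y: (n, d) flattened matrices.
    def kernel(a, b):
        return torch.exp(-torch.cdist(a, b).pow(2) / (2 * sigma ** 2))
    return kernel(x, x).mean() + kernel(y, y).mean() - 2 * kernel(x, y).mean()

# attn: learned attention matrices; fbn: group sparse brain networks (toy data).
attn = torch.rand(8, 90, 90, requires_grad=True)
fbn = torch.rand(8, 90, 90)
loss = group_sparse_penalty(attn) + mmd_loss(attn.flatten(1), fbn.flatten(1))
loss.backward()  # both terms are differentiable, so they can join the training loss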
Source journal
Neural Networks (Engineering & Technology: Computer Science, Artificial Intelligence)
CiteScore: 13.90
Self-citation rate: 7.70%
Articles per year: 425
Review time: 67 days
Journal description: Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.