ADformer: A Multi-Granularity Transformer for EEG-Based Alzheimer's Disease Assessment
Yihe Wang, Nadia Mammone, Darina Petrovsky, Alexandros T. Tzallas, Francesco C. Morabito, Xiang Zhang
arXiv:2409.00032 · arXiv - CS - Computational Engineering, Finance, and Science · 2024-08-17
Abstract
Electroencephalogram (EEG) has emerged as a cost-effective and efficient method for supporting neurologists in assessing Alzheimer's disease (AD). Existing approaches predominantly rely on handcrafted features or Convolutional Neural Network (CNN)-based methods, while the transformer architecture, which has shown promising results in various time-series analysis tasks, remains underexplored for interpreting EEG in AD assessment. Furthermore, most studies are evaluated under the subject-dependent setup and overlook the significance of the subject-independent setup. To address these gaps, we present ADformer, a novel multi-granularity transformer designed to capture temporal and spatial features and learn effective EEG representations. We employ multi-granularity data embedding across both dimensions and use self-attention to learn local features within each granularity and global features across granularities. We conduct experiments on 5 datasets with a total of 525 subjects under subject-dependent, subject-independent, and leave-subjects-out setups. Our results show that ADformer outperforms existing methods in most evaluations, achieving F1 scores of 75.19% and 93.58% in distinguishing AD from healthy control (HC) subjects on two large datasets with 65 and 126 subjects, respectively, under the challenging subject-independent setup.
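The abstract does not give architectural details, so the paper's exact model is not reproduced here. As a rough illustration of the multi-granularity idea it describes — segmenting an EEG recording into patches at several temporal granularities, embedding each granularity into tokens, and letting self-attention mix features within and across granularities — a minimal PyTorch sketch might look like the following. All module names, layer sizes, and the single shared attention layer are illustrative assumptions, not taken from the paper:

```python
import torch
import torch.nn as nn

class MultiGranularityEncoder(nn.Module):
    """Toy sketch (not the paper's ADformer): embed an EEG segment at
    several patch granularities and fuse all tokens with self-attention.
    Sizes and structure are illustrative assumptions."""

    def __init__(self, n_channels=19, patch_lens=(8, 16, 32), d_model=64, n_heads=4):
        super().__init__()
        self.patch_lens = patch_lens
        # One linear patch-embedding per temporal granularity.
        self.embeds = nn.ModuleList(
            nn.Linear(p * n_channels, d_model) for p in patch_lens
        )
        # A single self-attention layer sees the concatenated token
        # sequence, so it can attend both within one granularity (local)
        # and between granularities (global).
        self.attn = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.head = nn.Linear(d_model, 2)  # AD vs. healthy control logits

    def forward(self, x):
        # x: (batch, time, channels); time must divide every patch length.
        b, t, c = x.shape
        tokens = []
        for p, embed in zip(self.patch_lens, self.embeds):
            # Flatten non-overlapping windows of p time steps into patches.
            patches = x.reshape(b, t // p, p * c)
            tokens.append(embed(patches))          # (b, t // p, d_model)
        z = self.attn(torch.cat(tokens, dim=1))    # mix all granularities
        return self.head(z.mean(dim=1))            # (b, 2)

model = MultiGranularityEncoder()
eeg = torch.randn(4, 128, 19)   # 4 segments, 128 time steps, 19 electrodes
logits = model(eeg)
print(logits.shape)  # torch.Size([4, 2])
```

The sketch covers only the temporal dimension; the paper additionally embeds across the spatial (channel) dimension, which would add analogous channel-wise token sequences.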