Title: Adversarial Discriminative Domain Adaptation and Transformers for EEG-based Cross-Subject Emotion Recognition
Authors: Shadi Sartipi, M. Çetin
Venue: 2023 11th International IEEE/EMBS Conference on Neural Engineering (NER)
Publication date: 2023-04-24
DOI: 10.1109/NER52421.2023.10123837 (https://doi.org/10.1109/NER52421.2023.10123837)
Citations: 2
Abstract
Decoding human emotional states from electroencephalography (EEG) in affective brain-computer interfaces (BCI) is challenging due to inter-subject variability. Most existing methods require large amounts of EEG data from each new subject to calibrate the algorithm, which is time-consuming and not user-friendly. To address this issue, we propose a combination of transformers (TF) and adversarial discriminative domain adaptation (ADDA) to perform emotion recognition in a cross-subject manner. TF relies principally on the attention mechanism. Our proposed approach applies scaled dot-product attention along the feature-channel dimension of the EEG data to improve the spatial features. A temporal transformer is then applied to obtain globally discriminative representations from the time component. Moreover, ADDA aims to minimize the discrepancy between EEG data from different subjects. We evaluate the proposed ADDA-TF on the publicly available DEAP dataset and demonstrate the improvements it provides on low-versus-high valence and arousal classification.
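The scaled dot-product attention step described above can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the channel count, feature dimension, and toy data are illustrative assumptions; it shows only how self-attention re-weights per-channel EEG feature vectors by their pairwise similarity.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard scaled dot-product attention: softmax(QK^T / sqrt(d)) V.

    Q, K, V: (n_tokens, d) arrays; here the "tokens" are EEG channels,
    so attention mixes information across channels (the spatial aspect).
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                     # (n_tokens, n_tokens) similarities
    scores -= scores.max(axis=-1, keepdims=True)      # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over keys (rows sum to 1)
    return weights @ V                                # similarity-weighted channel features

# Toy EEG segment: 32 channels, each with an 8-dimensional feature vector
# (shapes are hypothetical, chosen only for the example).
rng = np.random.default_rng(0)
x = rng.standard_normal((32, 8))
out = scaled_dot_product_attention(x, x, x)           # self-attention across channels
print(out.shape)  # (32, 8): same shape, channel features re-weighted
```

Each output row is a convex combination of all channels' feature vectors, which is what lets the model emphasize informative spatial relationships before the temporal stage.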