{"title":"Recognition of emotional states induced by music videos based on nonlinear feature extraction and SOM classification","authors":"S. Hatamikia, A. Nasrabadi","doi":"10.1109/ICBME.2014.7043946","DOIUrl":null,"url":null,"abstract":"This research aims at investigating the relationship between Electroencephalogram (EEG) signals and human emotional states. A subject-independent emotion recognition system is proposed using EEG signals collected during emotional audio-visual inductions to classify different classes of continuous valence-arousal model. First, four feature extraction methods based on Approximate Entropy, Spectral entropy, Katz's fractal dimension and Petrosian's fractal dimension were used; then, a two-stage feature selection method based on Dunn index and Sequential forward feature selection algorithm (SFS) algorithm was used to select the most informative feature subsets. Self-Organization Map (SOM) classifier was used to classify different emotional classes with the use of 5-fold cross-validation. The best results were achieved using combination of all features by average accuracies of %68.92 and %71.25 for two classes of valence and arousal, respectively. Furthermore, a hierarchical model which was constructed of two classifiers was used for classifying 4 emotional classes of valence and arousal levels and the average accuracy of %55.15 was achieved.","PeriodicalId":434822,"journal":{"name":"2014 21th Iranian Conference on Biomedical Engineering (ICBME)","volume":"28 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"27","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 21th Iranian Conference on Biomedical Engineering (ICBME)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICBME.2014.7043946","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 27
Abstract
This research investigates the relationship between electroencephalogram (EEG) signals and human emotional states. A subject-independent emotion recognition system is proposed that uses EEG signals collected during emotional audio-visual induction to classify different classes of the continuous valence-arousal model. First, four feature extraction methods based on approximate entropy, spectral entropy, Katz's fractal dimension and Petrosian's fractal dimension were applied; then, a two-stage feature selection method based on the Dunn index and the sequential forward selection (SFS) algorithm was used to select the most informative feature subsets. A Self-Organizing Map (SOM) classifier was used to classify the different emotional classes under 5-fold cross-validation. The best results were achieved using the combination of all features, with average accuracies of 68.92% and 71.25% for the two classes of valence and arousal, respectively. Furthermore, a hierarchical model constructed from two classifiers was used to classify four emotional classes defined by valence and arousal levels, achieving an average accuracy of 55.15%.
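As an illustration of the nonlinear features named in the abstract, the sketch below computes approximate entropy, spectral entropy, Katz's fractal dimension and Petrosian's fractal dimension for a single EEG channel. This is a minimal reconstruction from the standard definitions of these measures, not the authors' code; the sampling rate, the ApEn parameters (m = 2, r = 0.2·std) and the Welch PSD settings are assumptions for illustration only.

```python
import numpy as np
from scipy.signal import welch


def approximate_entropy(x, m=2, r_factor=0.2):
    """Approximate entropy ApEn(m, r) with tolerance r = r_factor * std(x)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def phi(order):
        n = len(x) - order + 1
        templates = np.array([x[i:i + order] for i in range(n)])
        # Chebyshev distance between every pair of templates
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        counts = np.sum(dist <= r, axis=1) / n
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)


def spectral_entropy(x, fs=128.0):
    """Normalized Shannon entropy of the Welch power spectral density."""
    _, psd = welch(x, fs=fs, nperseg=min(256, len(x)))
    p = psd / psd.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p)) / np.log2(len(p))


def katz_fd(x):
    """Katz's fractal dimension of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    dists = np.abs(np.diff(x))
    L = dists.sum()               # total length of the curve
    d = np.max(np.abs(x - x[0]))  # maximum distance from the first sample
    n = len(dists)                # number of steps
    return np.log10(n) / (np.log10(n) + np.log10(d / L))


def petrosian_fd(x):
    """Petrosian's fractal dimension based on sign changes of the first derivative."""
    x = np.asarray(x, dtype=float)
    diff = np.diff(x)
    n_delta = np.sum(diff[1:] * diff[:-1] < 0)  # number of sign changes
    n = len(x)
    return np.log10(n) / (np.log10(n) + np.log10(n / (n + 0.4 * n_delta)))


if __name__ == "__main__":
    fs = 128.0                                   # assumed EEG sampling rate (Hz)
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal(int(8 * fs))       # stand-in for one 8-second EEG channel
    features = [
        approximate_entropy(eeg),
        spectral_entropy(eeg, fs=fs),
        katz_fd(eeg),
        petrosian_fd(eeg),
    ]
    print(features)
```

In the pipeline described above, such per-channel feature vectors would then be screened with the Dunn index and SFS and passed to a SOM classifier (for example via a library such as MiniSom); those later stages are not reproduced in this sketch.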