Machine to brain: facial expression recognition using brain machine generative adversarial networks

Dongjun Liu, Jin Cui, Zeyu Pan, Hangkui Zhang, Jianting Cao, Wanzeng Kong

DOI: 10.1007/s11571-023-09946-y (https://doi.org/10.1007/s11571-023-09946-y)
Pages: 863-875 | Published: 2024-06-01 (Epub: 2023-02-22)
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11143176/pdf/
Citations: 0
Abstract
The human brain can perform Facial Expression Recognition (FER) effectively from only a few samples by drawing on its cognitive ability. In contrast, even a well-trained deep neural network is data-dependent and lacks such cognitive ability. To tackle this challenge, this paper proposes a novel framework, Brain Machine Generative Adversarial Networks (BM-GAN), which uses the brain's cognitive ability to guide a Convolutional Neural Network to generate LIKE-electroencephalograph (EEG) features. Specifically, we first record EEG signals evoked by facial emotion images, then adopt BM-GAN to carry out the mutual generation of image visual features and EEG cognitive features. BM-GAN uses the cognitive knowledge learned from EEG signals to teach the model to perceive LIKE-EEG features, and thereby achieves FER performance resembling that of the human brain. The proposed model consists of VisualNet, EEGNet, and BM-GAN: VisualNet extracts image visual features from facial emotion images, EEGNet extracts EEG cognitive features from EEG signals, and BM-GAN completes the mutual generation of the two feature types. Finally, the predicted LIKE-EEG features of test images are used for FER. After training, without any participation of EEG signals, an average classification accuracy of 96.6% is obtained on the Chinese Facial Affective Picture System dataset using LIKE-EEG features for FER. Experiments demonstrate that the proposed method achieves excellent performance for FER.
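The pipeline the abstract describes — extract visual features from an image, map them through a generator into LIKE-EEG features, score them with a discriminator against real EEG cognitive features, and classify the expression from the generated features alone at test time — can be sketched as toy numpy code. Everything below (dimensions, the linear stand-ins for VisualNet/EEGNet, the names `like_eeg` and `classify`) is an illustrative assumption, not the authors' implementation; a real BM-GAN would use trained convolutional networks and an adversarial training loop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not taken from the paper).
D_VIS, D_EEG, N_CLASSES = 128, 64, 7

def linear(d_in, d_out):
    """Toy linear layer: (weight matrix, bias)."""
    return rng.normal(0, 0.1, (d_in, d_out)), np.zeros(d_out)

# Stand-ins for VisualNet / EEGNet: fixed random projections here.
W_vis, b_vis = linear(256, D_VIS)   # flattened image -> visual features
# Generator G: visual features -> LIKE-EEG features (the cross-modal mapping).
W_g, b_g = linear(D_VIS, D_EEG)
# Discriminator D: scores whether an EEG-feature vector looks real or generated.
W_d, b_d = linear(D_EEG, 1)
# Classifier: predicts the facial expression from (LIKE-)EEG features.
W_c, b_c = linear(D_EEG, N_CLASSES)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def like_eeg(visual_feat):
    """Predict LIKE-EEG cognitive features from visual features."""
    return np.tanh(visual_feat @ W_g + b_g)

def classify(eeg_feat):
    """Expression logits from (LIKE-)EEG features."""
    return eeg_feat @ W_c + b_c

# Test time: no EEG is recorded; only the image path is used.
image_feat = np.tanh(rng.normal(size=256) @ W_vis + b_vis)
fake_eeg = like_eeg(image_feat)
logits = classify(fake_eeg)
pred = int(np.argmax(logits))

# During training, the discriminator would compare real EEG cognitive
# features against the generated LIKE-EEG features.
real_score = sigmoid(rng.normal(size=D_EEG) @ W_d + b_d)
fake_score = sigmoid(fake_eeg @ W_d + b_d)

print(fake_eeg.shape, logits.shape, pred)
```

The key design point the sketch reflects is that EEG signals are needed only while training the generator; at inference the classifier sees only LIKE-EEG features predicted from the image, which is why the reported 96.6% accuracy requires no EEG recording from the end user.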