{"title":"基于人脑电图和人脸识别的电影预告片质量客观评价研究","authors":"Qing Wu, Wenbing Zhao, Tessadori Jacopo","doi":"10.1109/EIT.2018.8500283","DOIUrl":null,"url":null,"abstract":"In this paper, we propose a novel framework to objectively evaluate the quality of movie trailers by fusing two sensing modalities: (1) Human Electroencephalogram (EEG), and (2) computer-vision based facial expression recognition. The EEG sensing data are acquired via a cap instrumented with a set of 4-channel EEG sensors from the OpenBCI Ganglion board. The facial expressions are captured while a user is watching a movie trailer using a regular webcam to help establish the context for EEG analysis. On their own, facial expressions reveal how engaged a user is while watching a movie trailer. Additionally, facial expression data help us identify situations where noises caused by muscle movement in EEG data. Using a shallow neural network, we classify facial expressions into two categories: positive and negative emotions. A quarter-central decision making strategy model is used to analyze EEG signals with a low pass filter activated by time stamp when large human movements are detected. A small human subject test showed that the adaptive analysis method can achieve higher accuracy than that obtained via EEG alone. 
Besides for movie trailer evaluation, this framework can be utilized in the future towards remote training evaluation, wearable device personalization, and assisting paralyzed people to communicate with others.","PeriodicalId":188414,"journal":{"name":"2018 IEEE International Conference on Electro/Information Technology (EIT)","volume":"35 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-05-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":"{\"title\":\"Towards Objective Assessment of Movie Trailer Quality Using Human Electroencephalogram and Facial Recognition\",\"authors\":\"Qing Wu, Wenbing Zhao, Tessadori Jacopo\",\"doi\":\"10.1109/EIT.2018.8500283\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, we propose a novel framework to objectively evaluate the quality of movie trailers by fusing two sensing modalities: (1) Human Electroencephalogram (EEG), and (2) computer-vision based facial expression recognition. The EEG sensing data are acquired via a cap instrumented with a set of 4-channel EEG sensors from the OpenBCI Ganglion board. The facial expressions are captured while a user is watching a movie trailer using a regular webcam to help establish the context for EEG analysis. On their own, facial expressions reveal how engaged a user is while watching a movie trailer. Additionally, facial expression data help us identify situations where noises caused by muscle movement in EEG data. Using a shallow neural network, we classify facial expressions into two categories: positive and negative emotions. A quarter-central decision making strategy model is used to analyze EEG signals with a low pass filter activated by time stamp when large human movements are detected. A small human subject test showed that the adaptive analysis method can achieve higher accuracy than that obtained via EEG alone. 
Besides for movie trailer evaluation, this framework can be utilized in the future towards remote training evaluation, wearable device personalization, and assisting paralyzed people to communicate with others.\",\"PeriodicalId\":188414,\"journal\":{\"name\":\"2018 IEEE International Conference on Electro/Information Technology (EIT)\",\"volume\":\"35 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-05-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"5\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 IEEE International Conference on Electro/Information Technology (EIT)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/EIT.2018.8500283\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE International Conference on Electro/Information Technology (EIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/EIT.2018.8500283","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Towards Objective Assessment of Movie Trailer Quality Using Human Electroencephalogram and Facial Recognition
In this paper, we propose a novel framework to objectively evaluate the quality of movie trailers by fusing two sensing modalities: (1) human electroencephalogram (EEG) and (2) computer-vision-based facial expression recognition. The EEG data are acquired via a cap instrumented with a set of four-channel EEG sensors from the OpenBCI Ganglion board. Facial expressions are captured with a regular webcam while a user watches a movie trailer, which helps establish the context for EEG analysis. On their own, facial expressions reveal how engaged a user is while watching a trailer. Additionally, the facial expression data help us identify situations where muscle movement introduces noise into the EEG data. Using a shallow neural network, we classify facial expressions into two categories: positive and negative emotions. A quarter-central decision-making strategy model is used to analyze the EEG signals, with a low-pass filter activated at the timestamps where large body movements are detected. A small human-subject test showed that this adaptive analysis method achieves higher accuracy than analysis based on EEG alone. Beyond movie trailer evaluation, this framework could in the future be applied to remote training evaluation, wearable device personalization, and assisting paralyzed people in communicating with others.
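The adaptive analysis idea in the abstract — low-pass filtering only the EEG samples whose timestamps coincide with detected large movements — can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the use of a simple moving-average filter as the low-pass stage, and the boolean movement flags are all illustrative assumptions.

```python
def moving_average_lowpass(signal, window=5):
    """Illustrative low-pass filter: a simple moving average over `window` samples."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out


def adaptive_clean(eeg, movement_flags, window=5):
    """Low-pass filter only the samples flagged as movement-contaminated.

    `eeg` is a list of samples from one EEG channel; `movement_flags[i]` is
    True when a large body movement was detected (e.g., via the webcam) at
    the timestamp of sample i. Unflagged samples pass through unchanged.
    """
    filtered = moving_average_lowpass(eeg, window)
    return [f if flagged else raw
            for raw, f, flagged in zip(eeg, filtered, movement_flags)]
```

In this sketch the webcam-based detector supplies the flags, so the filter suppresses muscle-movement artifacts only where they are likely present, leaving clean EEG segments untouched.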