Using cardio-respiratory signals to recognize emotions elicited by watching music video clips

Leila Mirmohamadsadeghi, A. Yazdani, J. Vesin
2016 IEEE 18th International Workshop on Multimedia Signal Processing (MMSP), September 2016. DOI: 10.1109/MMSP.2016.7813349
The automatic recognition of human emotions from physiological signals is of growing interest in many applications. Images with high emotional content have been shown to alter physiological signals such as the electrocardiogram (ECG) and respiration, among many other recordings. However, emotion recognition from multimedia stimuli such as music video clips, which are growing in number in the digital world and are the medium of many recommendation systems, has not been adequately investigated. This study investigates the recognition of emotions elicited by watching music video clips, using features extracted from the ECG, the respiration, and several aspects of the synchronization between the two. On a public dataset, we achieved higher classification rates than the state of the art using either the ECG or the respiration signal alone. A feature related to the synchronization of the two signals achieved even better performance.
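The abstract does not detail the synchronization features used; as an illustrative sketch only (not the authors' method), one simple cardio-respiratory coupling measure is the correlation between the instantaneous heart rate, interpolated from ECG R-peak times onto a uniform grid, and the respiration waveform. All function names and parameters below are hypothetical:

```python
import numpy as np

def heart_rate_series(r_peak_times, fs_out=4.0, duration=None):
    """Interpolate instantaneous heart rate (bpm) from R-peak times onto a uniform grid."""
    r_peak_times = np.asarray(r_peak_times, dtype=float)
    rr = np.diff(r_peak_times)            # RR intervals (s)
    hr = 60.0 / rr                        # instantaneous heart rate (bpm)
    t_hr = r_peak_times[1:]               # each RR interval stamped at its ending beat
    if duration is None:
        duration = r_peak_times[-1]
    t_uniform = np.arange(0.0, duration, 1.0 / fs_out)
    return t_uniform, np.interp(t_uniform, t_hr, hr)

def cardio_resp_sync(r_peak_times, resp, fs_resp, fs_out=4.0):
    """Toy synchronization feature: Pearson correlation between heart rate and respiration."""
    duration = len(resp) / fs_resp
    t, hr = heart_rate_series(r_peak_times, fs_out, duration)
    # resample the respiration signal onto the same uniform time grid
    t_resp = np.arange(len(resp)) / fs_resp
    resp_u = np.interp(t, t_resp, resp)
    hr_z = (hr - hr.mean()) / hr.std()
    resp_z = (resp_u - resp_u.mean()) / resp_u.std()
    return float(np.mean(hr_z * resp_z))  # correlation in [-1, 1]
```

For respiratory sinus arrhythmia (heart rate rising during inspiration), this feature would be strongly positive; a phase-synchronization index would be a more refined alternative, but a plain correlation already captures the idea of quantifying coupling between the two signals.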