CLAS: A Database for Cognitive Load, Affect and Stress Recognition
Authors: V. Markova, T. Ganchev, Kalin Kalinkov
DOI: 10.1109/BIA48344.2019.8967457
Published in: 2019 International Conference on Biomedical Innovations and Applications (BIA), November 2019, pp. 1-4
Citations: 60
Abstract
We present the overall design and implementation of the CLAS dataset, a multimodal resource purposely developed in support of research and technology development (RTD) activities oriented towards the automated recognition of specific states of mind. Although the particular focus of our research is on states of mind associated with negative emotions, mental strain and high cognitive effort, the CLAS dataset can also support research of a wider scope, such as general studies on attention assessment, cognitive load assessment, emotion recognition and stress detection. The dataset consists of synchronized recordings of physiological signals, namely electrocardiography (ECG), photoplethysmography (PPG) and electrodermal activity (EDA), together with accelerometer data and metadata from 62 healthy volunteers, recorded while they were engaged in three interactive tasks and two perceptive tasks. The interactive tasks aim to elicit different types of cognitive effort and include solving sequences of math problems, logic problems and the Stroop test. The perceptive tasks use images and audio-video stimuli, purposely selected to evoke emotions in the four quadrants of the arousal-valence space. Joint analysis of the success rates in the interactive tasks, the information acquired through the questionnaires and the physiological recordings enables a multifaceted evaluation of specific states of mind. These results are important for the advancement of research on efficient human-robot collaboration and general research on intelligent human-machine interaction interfaces.
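The four-quadrant arousal-valence scheme mentioned above can be illustrated with a minimal sketch. This is not code from the CLAS paper: the centring of both axes at zero and the quadrant numbering (Q1 = high arousal / positive valence, proceeding counter-clockwise) are our assumptions for illustration only.

```python
def av_quadrant(valence: float, arousal: float) -> int:
    """Map a (valence, arousal) rating to a quadrant of the
    arousal-valence space.

    Assumed convention (hypothetical, not from the CLAS paper):
      Q1: high arousal, positive valence   Q2: high arousal, negative valence
      Q3: low arousal,  negative valence   Q4: low arousal,  positive valence
    Both axes are assumed centred at 0.
    """
    if arousal >= 0:
        return 1 if valence >= 0 else 2
    return 3 if valence < 0 else 4


# Example stimuli ratings (hypothetical values):
print(av_quadrant(0.7, 0.8))   # excited / joyful  -> quadrant 1
print(av_quadrant(-0.6, 0.7))  # fearful / angry   -> quadrant 2
print(av_quadrant(-0.5, -0.6)) # sad / bored       -> quadrant 3
print(av_quadrant(0.6, -0.4))  # calm / relaxed    -> quadrant 4
```

Selecting stimuli so that all four quadrants are covered, as the authors describe, ensures the elicited emotional states span both dimensions rather than varying along valence alone.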