FEELing (key)Pressed: Implicit Touch Pressure Bests Brain Activity for Modeling Emotion Dynamics in the Space Between Stressed & Relaxed

X. Laura Cang; Rubia R. Guerra; Bereket Guta; Paul Bucci; Laura Rodgers; Hailey Mah; Qianqian Feng; Anushka Agrawal; Karon E. MacLean

IEEE Transactions on Haptics, vol. 17, no. 3, pp. 310-318. DOI: 10.1109/TOH.2023.3308059. Available at: https://ieeexplore.ieee.org/document/10239097/
Citations: 0
Abstract
In-body lived emotional experiences can be complex, with time-varying and dissonant emotions evolving simultaneously; devices responding in real time to estimate personal human emotion should evolve accordingly. Models that assume generalized emotions exist as discrete states fail to operationalize valuable information inherent in the dynamic and individualistic nature of human emotions. Our multi-resolution emotion self-reporting procedure allows the construction of emotion labels along the Stressed-Relaxed scale, differentiating not only what the emotions are, but how they are transitioning, e.g., “hopeful but getting stressed” vs. “hopeful and starting to relax”. We trained participant-dependent hierarchical models of contextualized individual experience to compare emotion classification by modality (brain activity and keypress force from a physical keyboard), then benchmarked classification performance at F1-scores = [0.44, 0.82] (chance $F1 = 0.22$, $\sigma = 0.01$) and examined high-performing features. Notably, when classifying emotion evolution in the context of an experience that realistically varies in stress, pressure-based features from keypress force proved to be the more informative modality, and the more convenient one considering intrusiveness and ease of collection and processing. Finally, we present our FEEL (Force, EEG and Emotion-Labelled) dataset: a collection of brain activity and keypress force data, labelled with self-reported emotion, collected during tense videogame play (N = 16) and open-sourced for community exploration.
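For readers of the abstract unfamiliar with chance-level benchmarking, a common way to obtain a figure like the reported chance $F1 = 0.22$ is to rescore a classifier after permuting the labels. Below is a minimal, hedged sketch of that idea; the classifier choice, the synthetic arrays `X` and `y`, and all shapes are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical per-participant data: rows are feature windows (e.g.,
# keypress-force or EEG features), labels are emotion-transition
# classes on the Stressed-Relaxed scale. Purely illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))    # 200 windows x 12 features
y = rng.integers(0, 5, size=200)  # 5 illustrative emotion classes

clf = RandomForestClassifier(random_state=0)

# Participant-dependent performance: cross-validated macro-F1.
f1 = cross_val_score(clf, X, y, cv=5, scoring="f1_macro").mean()

# Chance level: repeat with permuted labels, which breaks any real
# feature-label association; the mean and spread approximate the
# chance F1 (the paper reports chance F1 = 0.22, sigma = 0.01).
chance = [
    cross_val_score(clf, X, rng.permutation(y), cv=5,
                    scoring="f1_macro").mean()
    for _ in range(20)
]
print(f"F1 = {f1:.2f}, chance F1 = {np.mean(chance):.2f} "
      f"(sigma = {np.std(chance):.2f})")
```

Run per participant, this yields the kind of score range (here, F1-scores between 0.44 and 0.82 against chance) that the abstract summarizes.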
Journal Introduction
IEEE Transactions on Haptics (ToH) is a scholarly archival journal that addresses the science, technology, and applications associated with information acquisition and object manipulation through touch. Haptic interactions relevant to this journal include all aspects of manual exploration and manipulation of objects by humans, machines, and interactions between the two, performed in real, virtual, teleoperated, or networked environments. Research areas of relevance include, but are not limited to, the following topics:

- Human haptic and multi-sensory perception and action
- Aspects of motor control that explicitly pertain to human haptics
- Haptic interactions via passive or active tools and machines
- Devices that sense, enable, or create haptic interactions locally or at a distance
- Haptic rendering and its association with graphic and auditory rendering in virtual reality
- Algorithms, controls, and dynamics of haptic devices, users, and interactions between the two
- Human-machine performance and safety with haptic feedback
- Haptics in the context of human-computer interactions
- Systems and networks using haptic devices and interactions, including multi-modal feedback
- Applications of the above, for example in education, rehabilitation, medicine, computer-aided design, skills training, computer games, driver controls, simulation, and visualization