The Invisible Potential of Facial Electromyography: A Comparison of EMG and Computer Vision when Distinguishing Posed from Spontaneous Smiles
Monica Perusquía-Hernández, S. Ayabe‐Kanamura, Kenji Suzuki, Shiro Kumano
Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, published 2019-05-02. DOI: 10.1145/3290605.3300379
Citations: 25
Abstract
Positive experiences are a success metric in product and service design. Quantifying smiles is a method of assessing them continuously. Smiles are usually a cue of positive affect, but they can also be fabricated voluntarily. Automatic detection is a promising complement to human perception in terms of identifying the differences between smile types. Computer vision (CV) and facial distal electromyography (EMG) have been proven successful in this task. This is the first study to use a wearable EMG that does not obstruct the face to compare the performance of CV and EMG measurements in the task of distinguishing between posed and spontaneous smiles. The results showed that EMG has the advantage of being able to identify covert behavior not available through vision. Moreover, CV appears to be able to identify visible dynamic features that human judges cannot account for. This sheds light on the role of non-observable behavior in distinguishing affect-related smiles from polite positive affect displays.
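To make the comparison concrete, the sketch below shows one generic way such a posed-versus-spontaneous distinction could be framed as binary classification. This is an illustration only, not the authors' pipeline: the per-smile features (onset duration, apex amplitude, left-right symmetry) and the synthetic data are assumptions introduced here, and an SVM is used simply as a standard off-the-shelf classifier.

```python
# Minimal sketch (assumed, not the paper's method): classify posed vs. spontaneous
# smiles from hypothetical per-smile dynamic features using synthetic data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 100  # samples per class

# Hypothetical features per smile: [onset_duration_s, apex_amplitude, left_right_symmetry]
spontaneous = rng.normal(loc=[0.8, 1.0, 0.9], scale=0.15, size=(n, 3))
posed = rng.normal(loc=[0.4, 1.2, 0.7], scale=0.15, size=(n, 3))

X = np.vstack([spontaneous, posed])
y = np.array([1] * n + [0] * n)  # 1 = spontaneous, 0 = posed

# Standardize features, then fit an RBF-kernel SVM; report cross-validated accuracy.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

In practice the feature vectors would come either from computer-vision facial landmarks or from wearable distal EMG signals, which is exactly the comparison the study makes; the classifier itself is interchangeable.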