Journal of Nonverbal Behavior: Latest Publications

A Review of Automatic Lie Detection from Facial Features
IF 2.1 | Tier 3 | Psychology
Journal of Nonverbal Behavior Pub Date : 2024-03-23 DOI: 10.1007/s10919-024-00451-2
Hugues Delmas, Vincent Denault, Judee K. Burgoon, Norah E. Dunbar
{"title":"A Review of Automatic Lie Detection from Facial Features","authors":"Hugues Delmas, Vincent Denault, Judee K. Burgoon, Norah E. Dunbar","doi":"10.1007/s10919-024-00451-2","DOIUrl":"https://doi.org/10.1007/s10919-024-00451-2","url":null,"abstract":"<p>The growth of machine learning and artificial intelligence has made it possible for automatic lie detection systems to emerge. These can be based on a variety of cues, such as facial features. However, there is a lack of knowledge about both the development and the accuracy of such systems. To address this lack, we conducted a review of studies that have investigated automatic lie detection systems by using facial features. Our analysis of twenty-eight eligible studies focused on four main categories: dataset features, facial features used, classifier features and publication features. Overall, the findings showed that automatic lie detection systems rely on diverse technologies, facial features, and measurements. They are mainly based on factual lies, regardless of the stakes involved. On average, these automatic systems were based on a dataset of 52 individuals and achieved an average accuracy ranging from 61.87% to 72.93% in distinguishing between truth-tellers and liars, depending on the types of classifiers used. However, although the leakage hypothesis was the most used explanatory framework, many studies did not provide sufficient theoretical justification for the choice of facial features and their measurements. Bridging the gap between psychology and the computational-engineering field should help to combine theoretical frameworks with technical advancements in this area.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"196 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140202661","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
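To make concrete what "accuracy depending on the types of classifiers used" involves in the kind of systems the review above covers, here is a minimal, hypothetical sketch of such a pipeline: a matrix of facial-feature measurements per person, a truth/lie label, and cross-validated accuracy for a few common classifier families. The data, feature count, and classifier choices are placeholders for illustration and are not taken from any reviewed study.

```python
# A minimal, hypothetical sketch of comparing classifier families on facial
# features for truth/lie classification, in the spirit of the systems reviewed.
# The data here are random placeholders, not any dataset from the review.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_people, n_features = 52, 17                  # 52 mirrors the average dataset size reported
X = rng.normal(size=(n_people, n_features))    # e.g., per-person action-unit intensities (placeholder)
y = rng.integers(0, 2, size=n_people)          # 1 = liar, 0 = truth-teller (placeholder labels)

classifiers = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "svm_rbf": SVC(kernel="rbf"),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean CV accuracy = {acc.mean():.2%}")
```

With real features and labels, the spread of cross-validated accuracies across classifier families is what produces ranges like the 61.87% to 72.93% reported in the review.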
Facial and Body Posture Emotion Identification in Deaf and Hard-of-Hearing Young Adults
IF 2.1 | Tier 3 | Psychology
Journal of Nonverbal Behavior Pub Date : 2024-03-12 DOI: 10.1007/s10919-024-00458-9
Brittany A. Blose, Lindsay S. Schenkel
{"title":"Facial and Body Posture Emotion Identification in Deaf and Hard-of-Hearing Young Adults","authors":"Brittany A. Blose, Lindsay S. Schenkel","doi":"10.1007/s10919-024-00458-9","DOIUrl":"https://doi.org/10.1007/s10919-024-00458-9","url":null,"abstract":"<p>The aim of the current study was to examine facial and body posture emotion recognition among deaf and hard-of-hearing (DHH) and hearing young adults. Participants were (<i>N</i> = 126) DHH (<i>n</i> = 48) and hearing (<i>n</i> = 78) college students who completed two emotion recognition tasks in which they were shown photographs of faces and body postures displaying different emotions of both high and low intensities and had to infer the emotion being displayed. Compared to hearing participants, DHH participants performed worse on the body postures emotion task for both high and low intensities. They also performed more poorly on the facial emotion task, but only for low-intensity emotional facial expressions. On both tasks, DHH participants whose primary mode of communication was Signed English performed significantly more poorly than those whose primary mode was American Sign Language (ASL) or spoken English. Moreover, DHH participants who communicated using ASL performed similarly to hearing participants. This suggests that difficulties in affect recognition among DHH individuals occur when processing both facial and body postures that are more subtle and reflective of real-life displays of emotion. Importantly, this also suggests that ASL as a primary form of communication in this population may serve as a protective factor against emotion recognition difficulties, which could, in part, be due to the complex nature of this language and its requirement to perceive meaning through facial and postural expressions with a wide visual lens.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"149 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140146790","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
A Tutorial for Deception Detection Analysis or: How I Learned to Stop Aggregating Veracity Judgments and Embraced Signal Detection Theory Mixed Models
IF 2.1 | Tier 3 | Psychology
Journal of Nonverbal Behavior Pub Date : 2024-03-01 DOI: 10.1007/s10919-024-00456-x
Mircea Zloteanu, Matti Vuorre
{"title":"A Tutorial for Deception Detection Analysis or: How I Learned to Stop Aggregating Veracity Judgments and Embraced Signal Detection Theory Mixed Models","authors":"Mircea Zloteanu, Matti Vuorre","doi":"10.1007/s10919-024-00456-x","DOIUrl":"https://doi.org/10.1007/s10919-024-00456-x","url":null,"abstract":"<p>Historically, deception detection research has relied on factorial analyses of response accuracy to make inferences. However, this practice overlooks important sources of variability resulting in potentially misleading estimates and may conflate response bias with participants’ underlying sensitivity to detect lies from truths. We showcase an alternative approach using a signal detection theory (SDT) with generalized linear mixed models framework to address these limitations. This SDT approach incorporates individual differences from both judges and senders, which are a principal source of spurious findings in deception research. By avoiding data transformations and aggregations, this methodology outperforms traditional methods and provides more informative and reliable effect estimates. This well-established framework offers researchers a powerful tool for analyzing deception data and advances our understanding of veracity judgments. All code and data are openly available.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"53 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140017387","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
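For readers new to the signal detection quantities the tutorial above builds on, the sketch below computes the standard equal-variance SDT estimates of sensitivity (d′) and response bias (criterion c) from one judge's hit and false-alarm counts. This per-judge, aggregated calculation is exactly what the authors argue against stopping at; their framework instead estimates the same quantities within a generalized linear mixed model with crossed random effects for judges and senders. The counts below are invented for illustration.

```python
# Minimal equal-variance SDT calculation for a single judge's veracity
# judgments: a "hit" is calling a lie a lie, a "false alarm" is calling a
# truth a lie. The tutorial's mixed-model approach estimates the same
# quantities jointly across judges and senders rather than per judge.
from scipy.stats import norm

def sdt_estimates(hits, misses, false_alarms, correct_rejections):
    # Log-linear correction avoids infinite z-scores at rates of 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)             # sensitivity
    criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))  # response bias
    return d_prime, criterion

# Invented counts: 40 lie trials and 40 truth trials for one judge.
d, c = sdt_estimates(hits=24, misses=16, false_alarms=18, correct_rejections=22)
print(f"d' = {d:.2f}, criterion c = {c:.2f}")
```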
Introduction to the Special Issue on Innovations in Nonverbal Deception Research: Promising Avenues for Advancing the Field
IF 2.1 | Tier 3 | Psychology
Journal of Nonverbal Behavior Pub Date : 2024-02-28 DOI: 10.1007/s10919-024-00457-w
Sally D. Farley
{"title":"Introduction to the Special Issue on Innovations in Nonverbal Deception Research: Promising Avenues for Advancing the Field","authors":"Sally D. Farley","doi":"10.1007/s10919-024-00457-w","DOIUrl":"https://doi.org/10.1007/s10919-024-00457-w","url":null,"abstract":"<p>Ekman and Friesen’s (1969) seminal theoretical paper on the leakage hierarchy sparked decades of research on the relationship between nonverbal cues and deception. Yet skepticism over the strength and reliability of behavioral cues to deception has been building over the years (DePaulo et al., 2003; Patterson et al., 2023; Vrij et al., 2019). However, the last two decades have seen dramatic growth in research paradigms, interviewing techniques, integration of technology, automated coding methods, and facial research, suggesting a need for reexamination of the current state of the field. This special issue includes theoretical and empirical papers that advance our understanding of the link between nonverbal cues and deception. This collection of papers suggests there is cause for some optimism in the field of nonverbal deception detection and signals some fruitful avenues for future research. Specifically, deception research in ecologically valid, high-stakes lie-detection situations using a multi-modal approach has good promise for differentiating truth-tellers from liars.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"23 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-02-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140004017","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Touch as a Stress Buffer? Gender Differences in Subjective and Physiological Responses to Partner and Stranger Touch
IF 2.1 | Tier 3 | Psychology
Journal of Nonverbal Behavior Pub Date : 2024-02-17 DOI: 10.1007/s10919-024-00455-y
Anik Debrot, Jennifer E. Stellar, Elise Dan-Glauser, Petra L. Klumb
{"title":"Touch as a Stress Buffer? Gender Differences in Subjective and Physiological Responses to Partner and Stranger Touch","authors":"Anik Debrot, Jennifer E. Stellar, Elise Dan-Glauser, Petra L. Klumb","doi":"10.1007/s10919-024-00455-y","DOIUrl":"https://doi.org/10.1007/s10919-024-00455-y","url":null,"abstract":"<p>Interpersonal touch buffers against stress under challenging conditions, but this effect depends on familiarity. People benefit from receiving touch from their romantic partners, but the results are less consistent in the context of receiving touch from an opposite-gender stranger. We propose that there may be important gender differences in how people respond to touch from opposite-gender strangers. Specifically, we propose that touch from an opposite-gender stranger may only have stress-buffering effects for men, not women. Stress was induced as participants took part in an emotion recognition task in which they received false failure feedback while being touched by a romantic partner or stranger. We measured subjective and physiological markers of stress (i.e., reduced heart rate variability) throughout the experiment. Neither stranger’s nor partner’s touch had any effect on subjective or physiological markers of stress for men. Women, however, subjectively experienced a stress-buffering effect of partner and stranger touch, but showed increased physiological markers of stress when receiving touch from an opposite-gender stranger. These results highlight the importance of considering gender when investigating touch as a stress buffer.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"7 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-02-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139903105","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
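The physiological stress marker mentioned in the abstract above, reduced heart rate variability, is computed from the intervals between successive heartbeats. As a purely illustrative example (the abstract does not state which HRV index the study used), the sketch below computes RMSSD, one common time-domain HRV metric, on invented inter-beat intervals.

```python
# One common heart rate variability metric, RMSSD (root mean square of
# successive differences between inter-beat intervals, in ms). Lower values
# are generally read as higher physiological stress. Whether this particular
# index was used in the study is not stated in the abstract; the RR intervals
# below are invented.
import numpy as np

def rmssd(rr_intervals_ms):
    diffs = np.diff(np.asarray(rr_intervals_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

baseline = [812, 790, 805, 798, 820, 801]      # invented RR intervals (ms)
stressor = [702, 698, 705, 700, 699, 703]
print(rmssd(baseline), rmssd(stressor))        # stressor series varies less -> lower RMSSD
```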
To Nod or Not to Nod: How Does Interviewer Nonverbal Behavior Affect Rapport Perceptions and Recall in Truth Tellers and Lie Tellers?
IF 2.1 | Tier 3 | Psychology
Journal of Nonverbal Behavior Pub Date : 2024-01-31 DOI: 10.1007/s10919-024-00452-1
Haneen Deeb, Sharon Leal, Aldert Vrij, Samantha Mann, Oliwia Dabrowna
{"title":"To Nod or Not to Nod: How Does Interviewer Nonverbal Behavior Affect Rapport Perceptions and Recall in Truth Tellers and Lie Tellers?","authors":"Haneen Deeb, Sharon Leal, Aldert Vrij, Samantha Mann, Oliwia Dabrowna","doi":"10.1007/s10919-024-00452-1","DOIUrl":"https://doi.org/10.1007/s10919-024-00452-1","url":null,"abstract":"<p>Researchers have often claimed that the interviewer’s nonverbal behavior such as nodding facilitates rapport building, the number of recalled details, and verbal veracity cues. However, there is no experiment to-date that isolated the effects of nodding in information gathering interviews. We thus examined the effects of interviewer’s nodding behavior on rapport perceptions and on the number and accuracy of total details provided by truth tellers and lie tellers. Participants (<i>N</i> = 150) watched a video recording and then reported it truthfully or falsely to an interviewer. The interviewer showed demeanor that was either supportive with nodding, supportive without nodding, or neutral. Truth tellers reported more total details than lie tellers and these effects were similar across demeanor conditions. No significant effects emerged for rapport perceptions and accuracy of total details. These results suggest that the interviewer’s nodding behavior does not affect rapport perceptions and details provided by truth tellers and lie tellers.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"152 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-01-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139647724","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Mining Bodily Cues to Deception
IF 2.1 | Tier 3 | Psychology
Journal of Nonverbal Behavior Pub Date : 2024-01-16 DOI: 10.1007/s10919-023-00450-9
{"title":"Mining Bodily Cues to Deception","authors":"","doi":"10.1007/s10919-023-00450-9","DOIUrl":"https://doi.org/10.1007/s10919-023-00450-9","url":null,"abstract":"<h3>Abstract</h3> <p>A significant body of research has investigated potential correlates of deception and bodily behavior. The vast majority of these studies consider discrete, subjectively coded bodily movements such as specific hand or head gestures. Such studies fail to consider quantitative aspects of body movement such as the precise movement direction, magnitude and timing. In this paper, we employ an innovative data mining approach to systematically study bodily correlates of deception. We re-analyze motion capture data from a previously published deception study, and experiment with different data coding options. We report how deception detection rates are affected by variables such as body part, the coding of the pose and movement, the length of the observation, and the amount of measurement noise. Our results demonstrate the feasibility of a data mining approach, with detection rates above 65%, significantly outperforming human judgement (52.80%). Owing to the systematic analysis, our analyses allow for an understanding of the importance of various coding factor. Moreover, we can reconcile seemingly discrepant findings in previous research. Our approach highlights the merits of data-driven research to support the validation and development of deception theory.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"99 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-01-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139500557","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
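To make the idea of quantitative movement coding in the paper above concrete, here is a small, hypothetical sketch of the kind of feature one can extract from motion-capture data: per-marker frame-to-frame displacement magnitudes summarised over an observation window. The array shapes, marker list, and window length are assumptions for illustration, not the coding scheme used in the paper.

```python
# Hypothetical feature extraction from motion-capture data: summarise how much
# each tracked body part moves within an observation window. Features like
# these (magnitude, variability, peaks) are what a data mining approach can
# feed to a classifier, instead of subjectively coded discrete gestures.
import numpy as np

def movement_features(coords: np.ndarray, fps: int = 30, window_s: float = 10.0):
    """coords: array of shape (n_frames, n_markers, 3) with x, y, z positions."""
    n_frames = min(coords.shape[0], int(fps * window_s))   # observation length
    window = coords[:n_frames]
    # Frame-to-frame displacement magnitude per marker: shape (n_frames - 1, n_markers)
    disp = np.linalg.norm(np.diff(window, axis=0), axis=2)
    return np.concatenate([
        disp.mean(axis=0),   # average movement per marker
        disp.std(axis=0),    # variability of movement per marker
        disp.max(axis=0),    # largest single-frame movement per marker
    ])

# Placeholder recording: 30 s at 30 fps, 5 markers (e.g., head, both hands, both feet).
rng = np.random.default_rng(1)
fake_recording = np.cumsum(rng.normal(scale=0.01, size=(900, 5, 3)), axis=0)
features = movement_features(fake_recording)
print(features.shape)   # (15,) -> 3 summary statistics x 5 markers
```

Varying the window length, the markers included, or the summary statistics used is what the paper means by experimenting with different data coding options.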
A Dynamic Disadvantage? Social Perceptions of Dynamic Morphed Emotions Differ from Videos and Photos
IF 2.1 | Tier 3 | Psychology
Journal of Nonverbal Behavior Pub Date : 2024-01-13 DOI: 10.1007/s10919-023-00448-3
Casey Becker, Russell Conduit, Philippe A. Chouinard, Robin Laycock
{"title":"A Dynamic Disadvantage? Social Perceptions of Dynamic Morphed Emotions Differ from Videos and Photos","authors":"Casey Becker, Russell Conduit, Philippe A. Chouinard, Robin Laycock","doi":"10.1007/s10919-023-00448-3","DOIUrl":"https://doi.org/10.1007/s10919-023-00448-3","url":null,"abstract":"<p>Dynamic face stimuli are increasingly used in face perception research, as increasing evidence shows they are perceived differently from static photographs. One popular method for creating dynamic faces is the dynamic morph, which can animate the transition between expressions by blending two photographs together. Although morphs offer increased experimental control, their unnatural motion differs from the biological facial motion captured in video recordings. This study aimed to compare ratings of emotion intensity and genuineness in video recordings, dynamic morphs, and static photographs of happy, sad, fearful, and angry expressions. We found that video recordings were perceived to have greater emotional intensity than dynamic morphs, and video recordings of happy expressions were perceived as more genuine compared to happy dynamic morphs. Unexpectedly, static photographs and video recordings had similar ratings for genuineness and intensity. Overall, these results suggest that dynamic morphs may be an inappropriate substitute for video recordings, as they may elicit misleading dynamic effects.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"61 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-01-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139465063","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
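Since the comparison above hinges on what a dynamic morph actually is (a transition animated by blending two photographs), the following sketch illustrates the basic blending step on placeholder images. It is only a conceptual illustration: real morphing tools also warp facial geometry between the endpoint images, and the images and frame count here are invented.

```python
# Hypothetical illustration of how a dynamic morph animates the transition
# between two expression photographs by blending them frame by frame.
# Real morphing software also warps facial geometry between the two images;
# this sketch shows only the simple cross-dissolve component.
import numpy as np

def cross_dissolve(start_img: np.ndarray, end_img: np.ndarray, n_frames: int = 30):
    """Both images: float arrays of identical shape (H, W, 3), values in [0, 1]."""
    weights = np.linspace(0.0, 1.0, n_frames)
    return [(1.0 - w) * start_img + w * end_img for w in weights]

# Placeholder "photographs" standing in for aligned neutral and happy face images.
neutral = np.zeros((128, 128, 3))
happy = np.ones((128, 128, 3))
frames = cross_dissolve(neutral, happy)
print(len(frames), frames[15].mean())   # 30 frames; the mid-sequence frame is roughly a 50/50 blend
```

The constant, linear rate of change produced this way is one reason morphed motion looks unlike the biological facial motion captured in video.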
Impaired Emotional Mimicry Responses Towards Objectified Women
IF 2.1 | Tier 3 | Psychology
Journal of Nonverbal Behavior Pub Date : 2024-01-09 DOI: 10.1007/s10919-023-00449-2
Daniela Ruzzante, Jeroen Vaes
{"title":"Impaired Emotional Mimicry Responses Towards Objectified Women","authors":"Daniela Ruzzante, Jeroen Vaes","doi":"10.1007/s10919-023-00449-2","DOIUrl":"https://doi.org/10.1007/s10919-023-00449-2","url":null,"abstract":"<p>Sexual objectification mostly targets women and occurs whenever they are treated as bodies for the use or consumption of others and stripped of their full humanity. While research has mostly focused on sexual harassment and aggression as the main behavioral consequence of sexual objectification, only a few studies have tried to focus on more subtle consequences towards sexually objectified targets. Spontaneous mimicry is an implicit behavior that influences our social interactions in general. It involves the imitation of other people’s postures, gestures and emotions and allows one to understand other’s emotions and intentions. Therefore, impairments in mimicry behavior are bound to have potentially damaging consequences in everyday social interactions for women who fall victim of sexual objectification. In two studies, using electromyography we measured participants’ mimicry behavior towards objectified and non-objectified women who expressed happiness and anger. Results indicated that both male and female participants attributed less mental and human traits and showed less mimicry behavior towards sexually objectified rather than non-objectified women especially when they expressed happiness. Given the fundamental role of mimicry in creating successful everyday interpersonal interactions, the results of this research advance our understanding on the more subtle, but daily consequences of sexual objectification.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"50 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-01-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139410111","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
The Effect of Synchrony of Happiness on Facial Expression of Negative Emotion When Lying
IF 2.1 | Tier 3 | Psychology
Journal of Nonverbal Behavior Pub Date : 2023-12-16 DOI: 10.1007/s10919-023-00447-4
{"title":"The Effect of Synchrony of Happiness on Facial Expression of Negative Emotion When Lying","authors":"","doi":"10.1007/s10919-023-00447-4","DOIUrl":"https://doi.org/10.1007/s10919-023-00447-4","url":null,"abstract":"<h3>Abstract</h3> <p>Meta-analyses have not shown emotions to be significant predictors of deception. Criticisms of this conclusion argued that individuals must be engaged with each other in higher stake situations for such emotions to manifest, and that these emotions must be evaluated in their verbal context (Frank and Svetieva in J Appl Res Memory Cognit 1:131–133, 10.1016/j.jarmac.2012.04.006, 2012). This study examined behavioral synchrony as a marker of engagement in higher stakes truthful and deceptive interactions, and then compared the differences in facial expressions of fear, contempt, disgust, anger, and sadness not consistent with the verbal content. Forty-eight pairs of participants were randomly assigned to interviewer and interviewee, and the interviewee was assigned to steal either a watch or a ring and to lie about the item they stole, and tell the truth about the other, under conditions of higher stakes of up to $30 rewards for successful deception, and $0 plus having to write a 15-min essay for unsuccessful deception. The interviews were coded for expression of emotions using EMFACS (Friesen and Ekman in EMFACS-7; emotional facial action coding system, 1984). Synchrony was demonstrated by the pairs of participants expressing overlapping instances of happiness (AU6 + 12). A 3 (low, moderate, high synchrony) × 2 (truth, lie) mixed-design ANOVA found that negative facial expressions of emotion were a significant predictor of deception, but only when they were not consistent with the verbal content, in the moderate and high synchrony conditions. This finding is consistent with data and theorizing that shows that with higher stakes, or with higher engagement, emotions can be a predictor of deception.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"239 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2023-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138683259","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
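As a rough illustration of the synchrony measure described in the abstract above (overlapping instances of AU6 + 12 in the two interaction partners), the sketch below computes how much two sets of coded happiness intervals overlap in time. The interval format and the idea of summing overlap durations are assumptions for illustration; the study's exact operationalization into low, moderate, and high synchrony is not reproduced here.

```python
# Hypothetical computation of expressive synchrony: total time during which
# both members of a dyad were coded as showing happiness (AU6 + AU12).
# Intervals are (start_s, end_s) pairs from facial coding; the values are invented.
def total_overlap(intervals_a, intervals_b):
    overlap = 0.0
    for a_start, a_end in intervals_a:
        for b_start, b_end in intervals_b:
            overlap += max(0.0, min(a_end, b_end) - max(a_start, b_start))
    return overlap

interviewer_happiness = [(2.0, 4.5), (30.0, 33.0)]
interviewee_happiness = [(3.0, 5.0), (31.5, 32.0), (60.0, 61.0)]
print(total_overlap(interviewer_happiness, interviewee_happiness))  # 1.5 + 0.5 = 2.0 seconds
```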