A comparison of the response-pattern-based faking detection methods.
Weiwen Nie, Ivan Hernandez, Louis Tay, Bo Zhang, Mengyang Cao
Journal of Applied Psychology
Published: 2025-01-20 · DOI: 10.1037/apl0001261
Citations: 0
Abstract
The covariance index method, the idiosyncratic item response method, and the machine learning method are the three primary response-pattern-based (RPB) approaches to detect faking on personality tests. However, less is known about how their performance is affected by different practical factors (e.g., scale length, training sample size, proportion of faking participants) and when they perform optimally. In the present study, we systematically compared the three RPB faking detection methods across different conditions in three empirical-data-based resampling studies. Overall, we found that the machine learning method outperforms the other two RPB faking detection methods in most simulation conditions. It was also found that the faking probabilities produced by all three RPB faking detection methods had moderate to strong positive correlations with true personality scores, suggesting that these RPB faking detection methods are likely to misclassify honest respondents with truly high personality trait scores as fakers. Fortunately, we found that the benefit of removing suspicious fakers still outweighs the consequences of misclassification. Finally, we provided practical guidance to researchers and practitioners to optimally implement the machine learning method and offered step-by-step code. (PsycInfo Database Record (c) 2025 APA, all rights reserved).
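To make the machine-learning approach described above concrete, the following is a minimal, hypothetical sketch of a response-pattern-based (RPB) faking detector: a classifier trained on labeled honest and faked item-response patterns that outputs a faking probability per respondent. This is not the authors' implementation or their step-by-step code; the data simulation, the plain logistic-regression trainer, and all parameter names (`N_ITEMS`, `N_PER_GROUP`, the inflation shift) are illustrative assumptions.

```python
# Hypothetical sketch of an RPB machine-learning faking detector.
# All data, parameters, and names are illustrative assumptions,
# not taken from the article.
import math
import random

random.seed(42)

N_ITEMS = 20       # scale length (a practical factor the study varies)
N_PER_GROUP = 200  # training sample size per group (another factor)

def simulate_respondent(faking):
    # Honest respondents answer around their trait level on a 1-5
    # Likert scale; fakers inflate responses toward the desirable end.
    trait = random.gauss(3.0, 0.7)
    shift = 1.2 if faking else 0.0
    return [min(5.0, max(1.0, random.gauss(trait + shift, 0.8)))
            for _ in range(N_ITEMS)]

# Labeled training data: 0 = honest, 1 = faking.
X = ([simulate_respondent(False) for _ in range(N_PER_GROUP)]
     + [simulate_respondent(True) for _ in range(N_PER_GROUP)])
y = [0] * N_PER_GROUP + [1] * N_PER_GROUP

# Plain logistic regression fit by stochastic gradient descent; a real
# application would use an ML library plus cross-validation.
w = [0.0] * N_ITEMS
b = 0.0
lr = 0.01
for _ in range(200):
    for xi, yi in zip(X, y):
        z = b + sum(wj * xj for wj, xj in zip(w, xi))
        z = max(-30.0, min(30.0, z))       # clamp to avoid overflow
        err = 1.0 / (1.0 + math.exp(-z)) - yi
        b -= lr * err
        w = [wj - lr * err * xj for wj, xj in zip(w, xi)]

def faking_probability(responses):
    # The trained model maps a response pattern to a faking probability.
    z = b + sum(wj * xj for wj, xj in zip(w, responses))
    z = max(-30.0, min(30.0, z))
    return 1.0 / (1.0 + math.exp(-z))

# On fresh simulated respondents, fakers should receive higher average
# faking probabilities than honest respondents.
honest_p = sum(faking_probability(simulate_respondent(False))
               for _ in range(100)) / 100
faker_p = sum(faking_probability(simulate_respondent(True))
              for _ in range(100)) / 100
print(f"mean P(faking) honest={honest_p:.3f}, faker={faker_p:.3f}")
```

Note that this toy model inherits the limitation the abstract highlights: respondents with genuinely high trait levels produce response patterns resembling faked ones, so their faking probabilities will also be elevated.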
Journal overview:
The Journal of Applied Psychology® focuses on publishing original investigations that contribute new knowledge and understanding to fields of applied psychology (excluding clinical and applied experimental or human factors, which are better suited for other APA journals). The journal primarily considers empirical and theoretical investigations that enhance understanding of cognitive, motivational, affective, and behavioral psychological phenomena in work and organizational settings. These phenomena can occur at individual, group, organizational, or cultural levels, and in various work settings such as business, education, training, health, service, government, or military institutions. The journal welcomes submissions from both public and private sector organizations, for-profit or nonprofit. It publishes several types of articles, including:
1. Rigorously conducted empirical investigations that expand conceptual understanding (original investigations or meta-analyses).
2. Theory development articles and integrative conceptual reviews that synthesize literature and generate new theories on psychological phenomena to stimulate novel research.
3. Rigorously conducted qualitative research on phenomena that are challenging to capture with quantitative methods or require inductive theory building.