August Håkan Nilsson, J Malte Runge, Adithya V Ganesan, Carl Viggo N G Lövenstierne, Nikita Soni, Oscar N E Kjell
DOI: 10.1037/pspp0000544 (https://doi.org/10.1037/pspp0000544)
Journal: Journal of personality and social psychology, Vol. 101, No. 1
Impact Factor: 6.400; JCR: Q1 (Psychology, Social); Region: 1 (Psychology)
Publication date: 2025-04-10 (Journal Article)
Platform: Semanticscholar
Automatic implicit motive codings are at least as accurate as humans' and 99% faster.
Implicit motives, nonconscious needs that influence individuals' behaviors and shape their emotions, have been part of personality research for nearly a century but differ from personality traits. The implicit motive assessment is very resource-intensive, involving expert coding of individuals' written stories about ambiguous pictures, and has hampered implicit motive research. Using large language models and machine learning techniques, we aimed to create high-quality implicit motive models that are easy for researchers to use. We trained models to code the need for power, achievement, and affiliation (N = 85,028 sentences). The person-level assessments converged strongly with the holdout data, intraclass correlation coefficient, ICC(1,1) = .85, .87, and .89 for achievement, power, and affiliation, respectively. We demonstrated causal validity by reproducing two classical experimental studies that aroused implicit motives. We let three coders recode sentences where our models and the original coders strongly disagreed. We found that the new coders agreed with our models in 85% of the cases (p < .001, ϕ = .69). Using topic and word embedding analyses, we found specific language associated with each motive to have a high face validity. We argue that these models can be used in addition to, or instead of, human coders. We provide a free, user-friendly framework in the established R-package text and a tutorial for researchers to apply the models to their data, as these models reduce the coding time by over 99% and require no cognitive effort for coding. We hope this coding automation will facilitate a historical implicit motive research renaissance. (PsycInfo Database Record (c) 2025 APA, all rights reserved).
Journal description:
The Journal of Personality and Social Psychology publishes original papers in all areas of personality and social psychology. It emphasizes empirical reports but may include specialized theoretical, methodological, and review papers. The journal is divided into three independently edited sections. Attitudes and Social Cognition addresses all aspects of psychology (e.g., attitudes, cognition, emotion, motivation) that take place in significant micro- and macrolevel social contexts.