Authors: Kendall A. Mather, Sara J. Weston, David M. Condon
DOI: 10.3758/s13428-024-02422-3
Journal: Behavior Research Methods, pp. 1-29 (Impact Factor 4.6; Q1, Psychology, Experimental)
Published: Epub 2024-06-05; issue date 2024-10-01
Scaling a common assessment of associative ability: Development and validation of a multiple-choice compound remote associates task.
The assessment of creativity as an individual difference has historically focused on divergent thinking, which is increasingly viewed as involving the associative processes that are also understood to be a key component of creative potential. Research on associative processes has proliferated in many sub-fields, often using Compound Remote Associates (CRA) tasks with an open response format and relatively small participant samples. In the present work, we introduce a new format that is more amenable to large-scale data collection in survey designs, and present evidence for the reliability and validity of CRA measures in general using multiple large samples. Study 1 uses a large, representative dataset (N = 1,323,480) to demonstrate strong unidimensionality and internal consistency (α = .97; ωt = .87), as well as links to individual differences in temperament, cognitive ability, occupation, and job characteristics. Study 2 uses an undergraduate sample (N = 685) to validate the use of a multiple-choice format relative to the traditional approach. Study 3 uses a crowdsourced sample (N = 357) to demonstrate high test-retest reliability of the items (r = .74). Finally, Study 4 uses a sample that overlaps with Study 1 (N = 1,502,922) to provide item response theory (IRT) parameters for a large set of high-quality CRA items that use a multiple-choice response mode, thus facilitating their use in future research on creativity, insight, and related topics.
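The internal-consistency figure reported above (α = .97) is Cronbach's alpha, computed from the number of items, the per-item variances, and the variance of total scores. As a minimal sketch of that formula (not the authors' actual analysis pipeline; the toy response matrix and function name are illustrative only):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    n_items = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

# Toy 0/1 (correct/incorrect) responses: 4 respondents x 3 items.
responses = np.array([
    [1, 1, 1],
    [1, 1, 0],
    [0, 0, 1],
    [0, 0, 0],
])
print(round(cronbach_alpha(responses), 3))  # 0.6 for this toy matrix
```

Scored multiple-choice CRA items yield exactly this kind of binary matrix, which is part of what makes the format amenable to large-sample reliability analysis.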
Journal description:
Behavior Research Methods publishes articles concerned with the methods, techniques, and instrumentation of research in experimental psychology. The journal focuses particularly on the use of computer technology in psychological research. An annual special issue is devoted to this field.