{"title":"探索顺序模型和 IRTree 模型中特定项目因素的影响。","authors":"Weicong Lyu, Daniel M Bolt, Samuel Westby","doi":"10.1007/s11336-023-09912-x","DOIUrl":null,"url":null,"abstract":"<p><p>Test items for which the item score reflects a sequential or IRTree modeling outcome are considered. For such items, we argue that item-specific factors, although not empirically measurable, will often be present across stages of the same item. In this paper, we present a conceptual model that incorporates such factors. We use the model to demonstrate how the varying conditional distributions of item-specific factors across stages become absorbed into the stage-specific item discrimination and difficulty parameters, creating ambiguity in the interpretations of item and person parameters beyond the first stage. We discuss implications in relation to various applications considered in the literature, including methodological studies of (1) repeated attempt items; (2) answer change/review, (3) on-demand item hints; (4) item skipping behavior; and (5) Likert scale items. Our own empirical applications, as well as several examples published in the literature, show patterns of violations of item parameter invariance across stages that are highly suggestive of item-specific factors. For applications using sequential or IRTree models as analytical models, or for which the resulting item score might be viewed as outcomes of such a process, we recommend (1) regular inspection of data or analytic results for empirical evidence (or theoretical expectations) of item-specific factors; and (2) sensitivity analyses to evaluate the implications of item-specific factors for the intended inferences or applications.</p>","PeriodicalId":54534,"journal":{"name":"Psychometrika","volume":null,"pages":null},"PeriodicalIF":2.9000,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Exploring the Effects of Item-Specific Factors in Sequential and IRTree Models.\",\"authors\":\"Weicong Lyu, Daniel M Bolt, Samuel Westby\",\"doi\":\"10.1007/s11336-023-09912-x\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Test items for which the item score reflects a sequential or IRTree modeling outcome are considered. For such items, we argue that item-specific factors, although not empirically measurable, will often be present across stages of the same item. In this paper, we present a conceptual model that incorporates such factors. We use the model to demonstrate how the varying conditional distributions of item-specific factors across stages become absorbed into the stage-specific item discrimination and difficulty parameters, creating ambiguity in the interpretations of item and person parameters beyond the first stage. We discuss implications in relation to various applications considered in the literature, including methodological studies of (1) repeated attempt items; (2) answer change/review, (3) on-demand item hints; (4) item skipping behavior; and (5) Likert scale items. Our own empirical applications, as well as several examples published in the literature, show patterns of violations of item parameter invariance across stages that are highly suggestive of item-specific factors. 
For applications using sequential or IRTree models as analytical models, or for which the resulting item score might be viewed as outcomes of such a process, we recommend (1) regular inspection of data or analytic results for empirical evidence (or theoretical expectations) of item-specific factors; and (2) sensitivity analyses to evaluate the implications of item-specific factors for the intended inferences or applications.</p>\",\"PeriodicalId\":54534,\"journal\":{\"name\":\"Psychometrika\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":2.9000,\"publicationDate\":\"2023-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Psychometrika\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://doi.org/10.1007/s11336-023-09912-x\",\"RegionNum\":2,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2023/6/16 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Psychometrika","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1007/s11336-023-09912-x","RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2023/6/16 0:00:00","PubModel":"Epub","JCR":"Q1","JCRName":"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Exploring the Effects of Item-Specific Factors in Sequential and IRTree Models.
Abstract: We consider test items for which the item score reflects a sequential or IRTree modeling outcome. For such items, we argue that item-specific factors, although not empirically measurable, will often be present across stages of the same item. In this paper, we present a conceptual model that incorporates such factors. We use the model to demonstrate how the varying conditional distributions of item-specific factors across stages become absorbed into the stage-specific item discrimination and difficulty parameters, creating ambiguity in the interpretations of item and person parameters beyond the first stage. We discuss implications for various applications considered in the literature, including methodological studies of (1) repeated attempt items; (2) answer change/review; (3) on-demand item hints; (4) item skipping behavior; and (5) Likert scale items. Our own empirical applications, as well as several examples published in the literature, show patterns of violations of item parameter invariance across stages that are highly suggestive of item-specific factors. For applications using sequential or IRTree models as analytical models, or for which the resulting item score might be viewed as an outcome of such a process, we recommend (1) regular inspection of data or analytic results for empirical evidence (or theoretical expectations) of item-specific factors; and (2) sensitivity analyses to evaluate the implications of item-specific factors for the intended inferences or applications.
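To make the mechanism described in the abstract concrete, here is a minimal simulation sketch (not code from the paper) of a two-stage sequential item in which both stages share identical generating parameters plus an unmeasured item-specific factor. All names and values (theta, u, the loading lam = 0.8, the sample size) are illustrative assumptions rather than the authors' specification. The point it demonstrates: because reaching stage 2 is conditioned on stage-1 success, the conditional distribution of the item-specific factor among stage-2 respondents shifts; a model that marginalizes over this unmeasured factor absorbs that shift into the stage-2 discrimination and difficulty parameters, producing the apparent violations of parameter invariance noted in the abstract.

# Illustrative sketch only; a hypothetical two-stage sequential
# (continuation-ratio style) item with a shared item-specific factor u.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000                       # simulated respondents (arbitrary)

theta = rng.normal(0.0, 1.0, n)   # target latent trait (modeled)
u = rng.normal(0.0, 1.0, n)       # item-specific factor (unmeasured), shared across stages

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

# Identical "true" generating parameters at both stages:
# discrimination a, difficulty b, item-specific loading lam (all hypothetical).
a, b, lam = 1.0, 0.0, 0.8

# Stage 1 (e.g., first attempt, or the "continue" node of a sequential item).
p1 = logistic(a * (theta + lam * u - b))
pass1 = rng.uniform(size=n) < p1

# Stage 2 is observed only for those who pass stage 1,
# generated here with exactly the same parameters as stage 1.
p2 = logistic(a * (theta + lam * u - b))
pass2 = rng.uniform(size=n) < p2

# Selection at stage 1 shifts the conditional distributions among stage-2 respondents.
# The theta shift is handled by the person parameter in a sequential/IRTree model;
# the u shift is not, and is absorbed into the stage-2 item parameters when u is
# marginalized out.
print(f"theta overall:        mean {theta.mean():.3f}, sd {theta.std():.3f}")
print(f"theta | reach stage 2: mean {theta[pass1].mean():.3f}, sd {theta[pass1].std():.3f}")
print(f"u overall:            mean {u.mean():.3f}, sd {u.std():.3f}")
print(f"u | reach stage 2:     mean {u[pass1].mean():.3f}, sd {u[pass1].std():.3f}")
print(f"stage-1 pass rate:            {pass1.mean():.3f}")
print(f"stage-2 pass rate | stage 1:  {pass2[pass1].mean():.3f}")

Running this, the mean of u among respondents who reach stage 2 is clearly positive and its spread is compressed, even though both stages were generated with identical parameters; under a theta-only sequential or IRTree analysis, that shifted, unmeasured component would show up as stage-2 discrimination and difficulty estimates differing from stage 1.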
Journal introduction:
The journal Psychometrika is devoted to the advancement of theory and methodology for behavioral data in psychology, education, and the social and behavioral sciences generally. Its coverage is offered in two sections: Theory and Methods (T&M), and Application Reviews and Case Studies (ARCS). T&M articles present original research and reviews on the development of quantitative models, statistical methods, and mathematical techniques for evaluating data from psychology, the social and behavioral sciences, and related fields. Application Reviews can be integrative, drawing together disparate methodologies for applications, or comparative and evaluative, discussing the advantages and disadvantages of one or more methodologies in applications. Case Studies highlight methodology that deepens understanding of substantive phenomena through more informative data analysis or more elegant data description.