Enhancing Content Validity Assessment With Item Response Theory Modeling.

Impact Factor: 3.2 · CAS Region 2 (Psychology) · JCR Q1, Psychology, Multidisciplinary
Rodrigo Schames Kreitchmann, Pablo Nájera, Susana Sanz, Miguel A Sorrel
Psicothema, 36(2), 145-153. Published 2024-05-01. doi: 10.7334/psicothema2023.208
Citations: 0

Abstract

Background: Ensuring the validity of assessments requires a thorough examination of the test content. Subject matter experts (SMEs) are commonly employed to evaluate the relevance, representativeness, and appropriateness of the items. This article proposes incorporating item response theory (IRT) to model the assessments conducted by SMEs. Using IRT allows for the estimation of discrimination and threshold parameters for each SME, providing evidence of their performance in differentiating relevant from irrelevant items, thus facilitating the detection of suboptimal SME performance while improving item relevance scores.
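The per-SME parameters described above fit the two-parameter logistic (2PL) family, with each SME playing the role of an "item". The following is a minimal sketch of that idea; the function name `p_relevant` and the toy parameter values are illustrative, not taken from the article:

```python
import math

def p_relevant(theta, a_j, b_j):
    """2PL probability that SME j rates an item of latent relevance
    `theta` as relevant. a_j is the SME's discrimination (how sharply
    the SME separates relevant from irrelevant items); b_j is the
    SME's threshold (the relevance level at which the SME says
    'relevant' with probability .5)."""
    return 1.0 / (1.0 + math.exp(-a_j * (theta - b_j)))

# A sharp SME (a = 2.5) vs. a noisy SME (a = 0.6), both with b = 0:
for theta in (-1.0, 0.0, 1.0):
    print(round(p_relevant(theta, 2.5, 0.0), 3),
          round(p_relevant(theta, 0.6, 0.0), 3))
```

A low estimated a_j flattens the curve, which is exactly the signature of suboptimal SME performance the abstract refers to.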

Method: Use of IRT was compared to traditional validity indices (content validity index and Aiken's V) in the evaluation of items. The aim was to assess the SMEs' accuracy in identifying whether items were designed to measure conscientiousness and in predicting their factor loadings.
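The two traditional indices named above have simple closed forms. A minimal sketch of both, using the standard definitions (Aiken's V as the mean rating rescaled to 0-1; item-level CVI as the proportion of experts rating the item relevant) with toy ratings, not the study's data:

```python
def aikens_v(ratings, lo=1, hi=5):
    """Aiken's V for one item: sum of (rating - lo) over raters,
    divided by n * (hi - lo). Ranges from 0 to 1."""
    n = len(ratings)
    s = sum(r - lo for r in ratings)
    return s / (n * (hi - lo))

def item_cvi(ratings, relevant_min=3):
    """Item-level content validity index: proportion of experts
    giving a rating of `relevant_min` or higher (here, 3-4 on a
    4-point relevance scale)."""
    return sum(r >= relevant_min for r in ratings) / len(ratings)

ratings = [4, 3, 4, 2, 3]              # five SMEs, 4-point scale
print(round(aikens_v(ratings, lo=1, hi=4), 3))  # 0.733
print(item_cvi(ratings))                        # 0.8
```

Both indices aggregate ratings without weighting SMEs, which is the limitation the IRT approach addresses: a rating from a highly discriminating SME counts the same as one from a noisy SME.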

Results: The IRT-based scores effectively identified conscientiousness items (R2 = 0.57) and accurately predicted their factor loadings (R2 = 0.45). These scores demonstrated incremental validity, explaining 11% more variance than Aiken's V and up to 17% more than the content validity index.
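Incremental validity of the kind reported above is the gain in R² when one predictor is added to a regression already containing another. A sketch of that comparison on simulated data (all variable names, coefficients, and values are synthetic, not the study's data):

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an OLS fit of y on X (intercept included)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

rng = np.random.default_rng(0)
n = 200
irt_score = rng.normal(size=n)                          # IRT-based relevance score
aiken = 0.6 * irt_score + 0.8 * rng.normal(size=n)      # correlated traditional index
loading = 0.7 * irt_score + 0.5 * rng.normal(size=n)    # criterion: factor loading

r2_aiken = r_squared(aiken.reshape(-1, 1), loading)
r2_both = r_squared(np.column_stack([aiken, irt_score]), loading)
print(round(r2_both - r2_aiken, 3))  # incremental variance explained
```

The difference `r2_both - r2_aiken` plays the role of the 11% figure in the abstract: variance in factor loadings explained by the IRT scores beyond Aiken's V.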

Conclusions: Modeling SME assessments with IRT improves item alignment and provides better predictions of factor loadings, enabling improvement of the content validity of measurement instruments.

Source journal: Psicothema (Psychology, Multidisciplinary)
CiteScore: 6.50
Self-citation rate: 16.70%
Articles per year: 69
Review time: 24 weeks
Journal description: Psicothema was founded in Asturias in 1989 and is published jointly by the Faculty and Department of Psychology of the University of Oviedo and the Official College of Psychologists of the Principality of Asturias. It publishes four issues per year. Both basic and applied research from any area of psychology are accepted, and all submissions are evaluated anonymously by external reviewers prior to publication. Psicothema is indexed in the most relevant national and international databases, including Psychological Abstracts, Current Contents, and MEDLINE/Index Medicus, and appears in the impact factor lists of the Journal Citation Reports. Psicothema is open to any psychological approach or orientation that is supported by the strength of its data and arguments, and welcomes all authors who can convince the reviewers that their manuscripts merit publication. Psicothema is an open-access journal, meaning that all content is freely available to any user or institution at no charge. Users may read, download, copy, distribute, print, search, or link to the full texts of the journal without prior permission from the publisher or the author, provided the original source is cited. For archives and repositories, coverage via links to the Psicothema website is preferred. We believe that a firm commitment to quality is the best way to serve our readers, whose suggestions are always welcome.