Evaluating interrater agreement with intraclass correlation coefficient in SPICE-based software process assessment

Hyung-Min Park, Ho-Won Jung
{"title":"用类内相关系数评价基于spice的软件过程评价中的类间一致性","authors":"Hyung-Min Park, Ho-Won Jung","doi":"10.1109/QSIC.2003.1319115","DOIUrl":null,"url":null,"abstract":"As software process assessment (SPA) involves a subjective procedure, its reliability is an important issue. Two types of reliability have intensively been investigated in SPA: internal consistency (internal reliability) and interrater agreement (external reliability). This study investigates interrater agreement. Cohen's Kappa coefficient has been a popular measure for estimating interrater agreement. However, the application of Kappa coefficient in certain situations is incorrect due to the \"Kappa Paradoxes\". To cope with the insufficiency of Kappa coefficient, this study applied the intraclass correlation coefficient (ICC) to estimate interrater agreement. The ICC has not been employed in the SPA context. In addition, we examined the stability of the estimated ICC value by using a bootstrap resampling method. Results show that ICC could be applied where the Kappa coefficient could not be applied, but not all cases.","PeriodicalId":145980,"journal":{"name":"Third International Conference on Quality Software, 2003. Proceedings.","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2003-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"11","resultStr":"{\"title\":\"Evaluating interrater agreement with intraclass correlation coefficient in SPICE-based software process assessment\",\"authors\":\"Hyung-Min Park, Ho-Won Jung\",\"doi\":\"10.1109/QSIC.2003.1319115\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"As software process assessment (SPA) involves a subjective procedure, its reliability is an important issue. Two types of reliability have intensively been investigated in SPA: internal consistency (internal reliability) and interrater agreement (external reliability). This study investigates interrater agreement. Cohen's Kappa coefficient has been a popular measure for estimating interrater agreement. However, the application of Kappa coefficient in certain situations is incorrect due to the \\\"Kappa Paradoxes\\\". To cope with the insufficiency of Kappa coefficient, this study applied the intraclass correlation coefficient (ICC) to estimate interrater agreement. The ICC has not been employed in the SPA context. In addition, we examined the stability of the estimated ICC value by using a bootstrap resampling method. Results show that ICC could be applied where the Kappa coefficient could not be applied, but not all cases.\",\"PeriodicalId\":145980,\"journal\":{\"name\":\"Third International Conference on Quality Software, 2003. Proceedings.\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2003-11-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"11\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Third International Conference on Quality Software, 2003. Proceedings.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/QSIC.2003.1319115\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Third International Conference on Quality Software, 2003. 
Proceedings.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/QSIC.2003.1319115","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 11

Abstract

As software process assessment (SPA) involves a subjective procedure, its reliability is an important issue. Two types of reliability have been intensively investigated in SPA: internal consistency (internal reliability) and interrater agreement (external reliability). This study investigates interrater agreement. Cohen's Kappa coefficient has been a popular measure for estimating interrater agreement. However, applying the Kappa coefficient in certain situations is incorrect due to the "Kappa paradoxes". To cope with this insufficiency of the Kappa coefficient, this study applied the intraclass correlation coefficient (ICC) to estimate interrater agreement. The ICC has not previously been employed in the SPA context. In addition, we examined the stability of the estimated ICC value by using a bootstrap resampling method. Results show that the ICC could be applied where the Kappa coefficient could not, but not in all cases.
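The abstract names three statistical tools: Cohen's Kappa, the intraclass correlation coefficient, and bootstrap resampling. The sketch below is not the paper's implementation; it only illustrates how these quantities can be computed for a two-assessor rating matrix. The ICC(2,1) variant, the 0-3 coding of the SPICE N/P/L/F rating scale, and the toy assessor data are assumptions made for the example.

```python
# Minimal illustrative sketch (assumptions noted above, not the authors' code):
# Cohen's Kappa, ICC(2,1), and a percentile bootstrap of the ICC.
import numpy as np

def cohens_kappa(r1, r2, n_categories):
    """Cohen's Kappa for two raters rating the same items."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    # Observed proportion of agreement.
    p_o = np.mean(r1 == r2)
    # Chance agreement from the raters' marginal category frequencies.
    p_e = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in range(n_categories))
    return (p_o - p_e) / (1.0 - p_e)

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rating.
    `ratings` is an (n items x k raters) matrix."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    col_means = ratings.mean(axis=0)
    # Mean squares from the two-way ANOVA decomposition (Shrout & Fleiss).
    ms_rows = k * np.sum((row_means - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((col_means - grand) ** 2) / (k - 1)
    ss_err = np.sum((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2)
    ms_err = ss_err / ((n - 1) * (k - 1))
    denom = ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    return np.nan if denom == 0 else (ms_rows - ms_err) / denom

def bootstrap_icc_ci(ratings, n_boot=2000, seed=0):
    """Resample items (rows) with replacement to gauge ICC stability."""
    rng = np.random.default_rng(seed)
    ratings = np.asarray(ratings, dtype=float)
    n = ratings.shape[0]
    samples = [icc_2_1(ratings[rng.integers(0, n, n)]) for _ in range(n_boot)]
    return np.nanpercentile(samples, [2.5, 97.5])

if __name__ == "__main__":
    # Hypothetical process-attribute ratings by two assessors,
    # coded 0=N, 1=P, 2=L, 3=F.
    rater_a = [3, 2, 2, 1, 3, 0, 2, 3, 1, 2]
    rater_b = [3, 2, 1, 1, 3, 0, 2, 2, 1, 2]
    ratings = np.column_stack([rater_a, rater_b])
    print("Kappa   :", round(cohens_kappa(rater_a, rater_b, 4), 3))
    print("ICC(2,1):", round(icc_2_1(ratings), 3))
    print("95% bootstrap CI for ICC:", bootstrap_icc_ci(ratings).round(3))
```

As a rough intuition for the "Kappa paradoxes" the abstract refers to: when the marginal rating distributions are strongly skewed (for example, nearly every attribute rated F by both assessors), the chance-agreement term p_e becomes large, so Kappa can be near zero even though observed agreement is high. The ICC is derived from a variance decomposition of the ratings rather than from that chance-correction term, which is one reason the paper examines it as an alternative in such cases.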